What impact did the various colonies have in forming what became the United States?

pohnpei397 | College Teacher

This is really a vast question that would take many books to fully answer.  Let me provide a few ideas here.

First, some of the colonies were more important than others in forming what would become the culture of the United States.  One can argue that the New England colonies were the most influential here.  From them came the "Protestant work ethic" that instilled the idea that to be American is to be hard-working and (until recently) frugal and sensible.

Second, some parts of various colonies were important in creating the individualistic "frontier" ethos that is so much a part of America.  These were the backwoods areas of many of the colonies.  The people who lived there were generally poorer people who had little hope of joining the elite of their time.  They were the ones who constantly pushed westward, trying to carve out spaces where they could live without being under the thumb of the elite.  This idea of rugged individualism has also come to influence our culture.

Finally, the Southern colonies were instrumental in creating the multiracial society we have today, with all of its costs and benefits.  The slave systems that they built have had a lasting impact.  It is unlikely that America would have as large an African-American population as it does had it not been for these slave states.  Having a racially diverse population can be seen as a good or a bad thing (and I say that as a person of color myself, not to be racist but simply to note that one can argue that a racially homogeneous society of any race may be more stable than a diverse one), but it is clearly a major fact about the United States.
