What role did the West play in making the United States an interdependent and self-sufficient nation during the 19th century?
The USA's westward expansion, and the creation of many new states in the West as a result, had consequences at a number of levels. The opening up and settlement of the West played a significant role in making America confident, strong, self-reliant, and economically self-sufficient, while also making it interdependent: during the 19th century, the various states and regions of the USA began to feel a sense of nationhood and linkage that transcended narrow local interests and state boundaries.
We can examine these linkages in depth at the following levels:
1. The westward expansion and its economic and political consequences, especially for new settlers and immigrants moving in from the East.
2. The strengthening of the US economy through the wealth and resources generated by the new western states and regions, including large-scale agriculture and farming that helped considerably in making the US self-sufficient in many products and goods.
3. The further expansionist ambitions that developed among Americans who now felt themselves strong and powerful enough to take a leading role in the Americas (North and South) economically, politically, socially, and morally, and who were thus eventually able to dominate and even colonize various areas.
Without going into a detailed evaluation of the rights and wrongs of such expansionism and growth (which you will need to do as part of your own study), one could say that the United States, as a result of its westward expansion, became stronger and more prosperous, while the West in turn depended on other parts of the country. These interlinkages, and the accompanying sense of nationalism and national pride, became tied to a larger abstract concept: 'Manifest Destiny'. This idea held that the United States had a clear, predestined role to expand, grow, and dominate. With such an idea in mind, and with the economic and political benefits of westward growth, Americans became better off, more confident and self-assured, and determined to prove their greatness as a nation.