Westward expansion was, on the whole, a very good thing for the United States. It gave the country a larger and stronger economy, made it more of a military power, and even (arguably) made it more democratic. Of course, Native Americans were devastated by this expansion, but the United States itself benefited.
Westward expansion made the country richer. First, it helped drive the Industrial Revolution in the US. As railroads expanded across the country, they spurred industrialization by, for example, increasing the demand for steel. Perhaps even more importantly, westward expansion gave the US more resources: the agricultural production of the Great Plains and the mineral wealth of places like Colorado, Nevada, and Idaho. These gains helped enrich the country.
By spreading from “sea to shining sea,” the US became stronger. It no longer had to worry about the possibility of a hostile foreign power on the same landmass, and it was positioned to be a force in both the Pacific and the Atlantic. Thus, westward expansion also increased American military power and potential.
Finally, many scholars have argued that expansion made the US more democratic and made Americans more self-sufficient. The US was, for a long time, a country with a frontier. People could go out into the frontier and make new lives for themselves. They did not have to stay in the East and work for other people. They could go west and be their own bosses. This, it is said, helped make the US more democratic and it helped give Americans the idea that people are supposed to take care of themselves instead of waiting for others to take care of them.
Again, we should not forget that Native Americans and the people of Mexico were hurt by westward expansion. However, if we look at this issue only from the point of view of the United States as a whole, westward expansion was a very good thing.