What did the west mean to the new United States?

pohnpei397 | College Teacher | (Level 3) Distinguished Educator

Throughout the early part of American history, “the West” meant both opportunity and danger.

From the earliest days of the US, the West was a place where people could go in hopes of improving their lot in life. The presence of the West, it is often argued, helped make Americans more independent and more resourceful, because settlers had to make it on their own in a relatively unsettled environment.

The West was also, particularly in the early US, a source of danger. There was always the fear of Indian attack, especially in the earliest days of the country, when the British were still encouraging Indians to attack American settlers.

In these ways, the West was a source of both opportunity and danger.