What did the west mean to the new United States?

Expert Answers
pohnpei397, eNotes Educator | Certified Educator

Throughout the early part of American history, “the West” meant both opportunity and danger.

From the earliest days of the US, the West was a place where people could go in hopes of improving their lot in life. The presence of the West, it is said, helped make Americans more independent and more resourceful, because settlers had to make it on their own in a relatively undeveloped environment.

The West was also, particularly in the early US, a source of danger. There was always the fear of Indian attack. This was especially true in the earliest days of the country, when the British were still encouraging Indian tribes to attack Americans.

In these ways, the West represented both opportunity and danger to the new United States.