What did the west mean to the new United States?

Expert Answers

pohnpei397, eNotes Educator | Certified Educator


Throughout the early part of American history, “the West” meant both opportunity and danger.

From the earliest days of the United States, the West was a place where people could go in hopes of improving their lot in life. The presence of the...
