Homework Help

What happened to the Native Americans after the Americans settled the West?

questionare123 | Student, Grade 10 | eNotes Newbie

Posted September 27, 2012 at 3:18 AM via web

1 Answer

pohnpei397 | College Teacher | (Level 3) Distinguished Educator

Posted September 27, 2012 at 3:27 AM (Answer #1)

After the Americans settled the West, the Native Americans were pushed off their lands and onto reservations. Once they were confined to reservations, the American government pressed hard to make the Native Americans assimilate into mainstream American culture.

As American settlers moved west, they encroached on Native American lands. The settlers wanted those lands, so the government pushed the Indians off. Often, settlers later came to want lands that had been guaranteed to the Indians by treaty, and the Indians were pushed off yet again. Once the Indians were fully confined to the reservations, the government implemented programs such as the boarding schools, which aimed to “kill the Indian in him, and save the man.”
