What happened to the Native Americans after the Americans settled the West?

pohnpei397 | College Teacher | (Level 3) Distinguished Educator

After the Americans settled the West, the Native Americans were pushed off their lands and onto reservations. Once they were on the reservations, the American government tried hard to make them assimilate into white American ways of life.

As American settlers moved west, they came onto Native American lands. Because the settlers wanted those lands, the government pushed the Indians off. Often, settlers later wanted lands that had been guaranteed to the Indians by treaty, and so the Indians were pushed off yet again. Once the Indians were fully confined to reservations, the government implemented programs like the boarding schools, which aimed, in the words of their founder Richard Henry Pratt, to “kill the Indian in him, and save the man.”
