What happened to the Native Americans after the Americans settled the West?

Expert Answers
pohnpei397, eNotes Educator | Certified Educator

After Americans settled the West, the Native Americans were pushed off their land and onto reservations. Once they were on reservations, the American government tried hard to make them assimilate into American ways of life.

As American settlers moved west, they came onto Native American lands. Because the settlers wanted those lands, the government pushed the Indians off. Often, settlers then wanted lands that had been granted to the Indians in treaties, and so the Indians were pushed off again. Once they were fully confined to reservations, the government implemented programs like the boarding schools, which aimed to "kill the Indian and save the man."