What happened to the Native Americans as whites settled the West?

pohnpei397 | College Teacher | (Level 3) Distinguished Educator

As whites settled the American West, Native Americans were pushed off their ancestral lands and confined to reservations. This process was often accompanied by fighting between Native American tribes and the US Army. Once it was complete, the US government tried to assimilate Native Americans by destroying their culture.

At first, US policy toward Native Americans involved simply forcing them onto reservations, often through violence. These reservations were typically on marginal lands that could not support the tribes, particularly after white hunters had devastated the buffalo herds. As a result, Native Americans often became dependent on government Indian agents for their survival.

Once the reservation system was in place, the US moved to a policy of assimilation. The goal of this policy was to "kill the Indian and save the man." That is, the government aimed to eradicate Native American culture and make Indians become more like whites. This was symbolized most clearly by the boarding school system, which took young Indians away from their tribes in order to Americanize them.
