What happened to the Native Americans as whites settled the West?

Expert Answers
pohnpei397 | eNotes Educator, Certified Educator

As whites settled the American West, Native Americans were pushed off their ancestral lands and confined to reservations.  This process was often accompanied by fighting between the Native Americans and the US Army.  After this process was over, the US tried to assimilate Native Americans by destroying their culture.

At first, US policy towards the Native Americans involved only forcing them onto reservations.  This was often done through violence.  It typically put the Native Americans on marginal lands that could not support them, particularly after the buffalo herds had been devastated by white hunters.  The Native Americans then often became dependent on Indian agents for their support.

After the process of putting the Indians on reservations was complete, the US moved to a policy of assimilation.  Under this policy, the goal was to "kill the Indian to save the man."  That is, the government aimed to eradicate Indian culture and make the Indians more like whites.  This was symbolized most clearly by the boarding school system, which was meant to take young Indians away from their tribes and Americanize them.

iklan100 | Student

A reasonable answer. For Native American resistance to such expansion, and other broader, linked themes, please also see below.