More so than the lessons, it is the history books of the past that portrayed Native Americans as savages who were conquered by a more civilized population that was here to help them become a part of the "new" America. Today, history books are getting better at acknowledging the fact that Native Americans were pushed off their land and were not given a choice in the matter. As for elementary school, I remember lessons really focusing on the culture of the Native Americans and appreciating the good relationship they did have with many colonists.
I certainly think the depiction offered today is a better and healthier one. There was a point in the teaching of history when Native Americans were presented as caricatures. Either they were rendered through a shallow stereotype, as "exotic" individuals who lived off the land and killed buffalo, or they were met with silence. The latter is probably more frightening in my mind, because this silence helped to justify the belief that Westward Expansion was almost a type of birthright for Americans, one where resistance was shown to be nonexistent. I think over the last 15 years in particular, there has been a stronger and more focused depiction, along with recognition of the challenges within the American dynamic. For every advance and step toward greatness, American history must answer for its treatment of Native Americans and the denial of their rightful share in the advancement and riches of America. There has been a stronger message relayed to younger children that Native Americans possess their own culture and their own sense of identity, one that pushes students to examine what happened to that identity and why it happened the way it did.