Why has the settlement of the West become such a powerful component of the central ideology of many Americans?
It has become so powerful because the settlement of the West fits so well with how many Americans see their country and themselves.
The settlement of the West (at least in our collective memory) shows Americans at their most rugged and individualistic. As Americans settled the West, they did not (so the story goes) need help from the government. They did it themselves, showing their ingenuity and bravery.
The settlement of the West was also (we believe) an act of bravery. It took brave people to face the harsh climate, the wilderness, and conflict with the Indians. We like to think of the US as the "home of the brave," and this image fits right in.
Finally, the settlement of the West shows how democratic we are. It was a time (we believe) without elites or powerful people, just individual Americans, all equal to one another, making their way west.
In these ways, the settlement of the West fits perfectly with our self-image. It shows us as brave, individualistic people living in an egalitarian society.