A variety of answers can be given to this broad question; it depends largely on one's own view of historical significance and personal interests. Personally, I would identify World War II, the civil rights movement, and the Great Depression as three of the most important subjects one could study in United States history since 1877.
To begin, World War II impacted the entire world in ways that are still being felt today. Redrawn national borders, new alliances, and expanded military spending are just a few of the major changes that WWII brought about. Moreover, it reshaped how the United States was viewed around the globe: the US took on a larger leadership role, starting with the rebuilding of Europe.
The civil rights era is arguably the most important period in American social history. Nearly 100 years after the Civil War, African Americans were still not treated as equals under the law. Peaceful protests and several landmark Supreme Court decisions changed the character of the US. With Jim Crow...