What are the main events that took place in American history in the 20th century?
Can you please discuss them in chronological order? Also, if you have any sources that could help me, please link them. Thank you very much in advance :)
There are, of course, way too many important events from the 20th century to list them all here and discuss them in any detail. I would argue that the most important ones include:
- The Progressive Era, from 1900 to WWI. This was a period of major reforms that gave the government more control over business and made the US somewhat more democratic.
- The Roaring '20s. This era brought the beginnings of a consumer culture. It also brought the first "culture wars" between traditionalists and those who wanted change.
- The Great Depression and New Deal.
- The prosperity of the 1950s. This made America the strongest and richest country in the world and gave Americans the idea that they should always be the most powerful and that their standard of living should always rise.
- The social upheaval of the '60s.
- The Vietnam War.
- The rise of conservatism with Reagan in the '80s.
- The end of the Cold War.
As for references, all of these events are discussed at length here on eNotes. I have included three links as examples.