There have been many important moments in US and world history since the end of WWI. My top five would include:
- The beginning of WWII. World War II was perhaps the most important event of the entire 20th century.
- The dropping of the atomic bombs on Japan. This is important more for the fact that it started the nuclear age than for the fact that it ended the war. Nuclear weapons are a symbol of the Cold War, which was so pivotal in world history.
- The passage of the 1964 Civil Rights Act. If you are asking about US history, this stands in for the societal changes that happened in the '60s. The most notable of these was the change in race relations in the US.
- The fall of the Berlin Wall. This symbolized the end of the Cold War.
- The 9/11 terrorist attacks. The attacks created the world we now live in, with the War on Terror and the tensions between the Muslim world and the West.