Did World War I impact American society more than World War II?
Quick answer:
Both World War I and World War II significantly impacted American society, but in different ways. World War I helped bring about women's suffrage, spurred civil rights awareness among African Americans, and shifted America's outlook on the wider world. World War II brought women into the workforce permanently, advanced civil rights movements, initiated the Cold War era, and fueled economic growth and suburbanization. Both wars were transformative, but World War II produced the more enduring societal changes.
Both World War I and World War II had profound short- and long-term impacts on American society. To answer the question, it is best to examine the societal impacts of each of these devastating conflicts; you can then decide which, in your opinion, had the greater impact.
America entered World War I late and was involved in active fighting for less than a year, yet the conflict spurred numerous societal changes. One of the biggest and longest-lasting was the granting of women's suffrage. The vote had long been a contested issue, but when women began contributing heavily to the war effort at home, they pressed their demand for the ballot. President Woodrow Wilson declared voting rights for women a "necessary war effort" and persuaded Congress to pursue it. As a result, the 19th Amendment to the Constitution was ratified just after the war, in 1920.
After World War I, African Americans who had served overseas returned home more aware of the inequality they faced at home compared with what they had experienced abroad, and they increased their demands for civil rights.
Americans were intolerant of Germans during and after World War I, so much so that many citizens downplayed their German ancestry.
World War I also caused America to look outward beyond its borders and consider the welfare of the world. America went to war on principle, to "keep the world safe for democracy," and this outlook shaped American attitudes and foreign policy for decades.
During World War II, women once again took over numerous jobs on the home front while men joined the military and went overseas. However, when the war was over, instead of reverting to their previous domestic roles, many women went on to join the workforce on a permanent basis.
The hard work and heroism of African Americans during World War II intensified the struggle for civil rights, leading to bans on discriminatory practices in federal agencies and the desegregation of the military shortly after the war. Native Americans, Hispanic Americans, and other minorities also served heroically during the war and afterward pressed harder for equal rights.
The close of World War II brought about the start of the Cold War and a wave of anticommunism in the United States, which shaped society for decades to come. Many servicemen returned home and received an education through the GI Bill. A strong postwar economy fueled the growth of suburbia and the baby boom. Suburban growth, in turn, drew white families out of the cities, leaving inner-city populations composed mainly of African Americans and other minorities.
As we can see, then, both wars greatly affected American society, and the changes brought about by World War I eventually led to further changes during and after World War II.