Did World War I have a greater impact on American society than World War II?
World War II had a much greater impact on United States society than World War I for several reasons. First, our involvement in supplying the Allied war effort, which started long before we declared war, came at the end of the Great Depression, the worst economic crisis in US history. As the federal government began buying massive amounts of war materials, such as guns and planes, to ship overseas, factories needed workers. Full employment returned, finally ending the devastating depression. American society became more vigorous, happy, and hopeful as people had money to spend or, after wartime rationing began, to save.
Second, while the US was already the ascending world power at the start of the 20th century, WWII made that fact obvious. By the end of the war, Great Britain had passed the superpower baton to the US, and the 20th century became the "American century." American society was victorious, confident, and prosperous. We had defeated not just an enemy but what was understood as evil incarnate, and we felt very good about what we stood for and who we were.
WWI did have an impact on US society, helping secure women's suffrage and bringing a decade of prosperity, but its impact was not as long-lasting or profound as that of WWII.
World War II had a much greater impact on American society than World War I did.
One major impact of WWII was the movement of women into the labor force. This happened on a much larger scale in WWII than in WWI because the war lasted longer and drew far more men into the Armed Forces. Another major impact of WWII was technological. The war involved far more technology than WWI did, bringing American society advances such as widespread air travel. Finally, WWII had a much greater effect on the American psyche. It consumed the US for four years, changing every aspect of life for a far longer time than WWI did, and its impact on Americans' attitudes lasted well beyond the end of the war.