Of the effects of World War II on the U.S., which one was the most important? 

pohnpei397 | College Teacher | (Level 3) Distinguished Educator
There are at least three main candidates for this title.  Let us look at each.

The US became the world’s leading economic power.  World War II was a boon for the US economy.  First, it pulled the economy out of the Great Depression.  Next, it spurred the development of important new technologies.  Finally, and perhaps most importantly, it badly damaged every other major industrial power.  While England, France, Germany, and Japan struggled to rebuild after the war, the US was untouched (in terms of its industrial capacity).  This made it the strongest economic power in the world.

The US became the superpower leading the “free world.”  After WWII, there were essentially only two major powers left in the world: the USSR and the US.  The war had destroyed or severely weakened all of the other major powers.  This meant that the US was cast in the role of defender of democracy and capitalism against the Soviets.

US society became more educated and more suburbanized.  This was mainly due to the GI Bill, which allowed huge numbers of ex-servicemen to go to college.  They became the white-collar workers in the companies that boomed along with the US economy.  This changed the face of the US forever, as the country became far more suburban, more white-collar, and more educated than it had ever been.

Any of these could be seen as the most important effect of WWII on the US.
