History Questions and Answers


What were the effects of WWII on the United States?

Expert Answers

Kelli Beard, M.A. eNotes educator | Certified Educator

Teacher (K-12)

B.A. from Rutgers University-New Brunswick/Piscataway

M.A. from Rutgers University-New Brunswick/Piscataway

Educator since 2014

164 answers

Top subjects are History, Literature, and Social Sciences

World War II had several major effects on the United States, which I will summarize below in terms of economic and social changes.

Economic Changes

The biggest impact of WWII on the United States was economic stimulation. To supply the war effort, American industry boomed, and that growth continued after the war ended.

Social Changes

After the war ended, there was a shift away from urban, city life—this time, however, toward the suburbs rather than the countryside. Families, supported by a strong and growing economy, began purchasing homes outside the city and relied on public transportation or cars to commute to their jobs. More women were also working, many having entered the workforce to support American troops during wartime. With the transition to suburban life came the growth of mass consumerism, as a stronger economy and advances in technology and industry fueled demand for new goods. This was also the period when the "baby boomer" generation was born; in a stable, post-war United States, the population exploded.

It should be noted, however, that African Americans did not share equally in this prosperity. Racial tension, segregation, and Jim Crow laws in the South, among other factors, helped spur the Civil Rights Movement.
