What were the effects of WWII on the United States?
World War II had several major effects on the United States, which can be summarized in terms of social and economic changes.
The biggest impact of WWII on the United States was economic stimulation. Wartime production triggered an industrial boom, and that growth continued well after the war ended.
After the war, there was a shift away from urban, city life—this time in the form of suburbs. Families, supported by a strong and growing economy, were able to purchase homes outside the city and rely on public transportation or cars to commute to their jobs. Women also continued working in greater numbers, an outgrowth of their wartime labor in support of American troops. With the transition to suburban life came the growth of mass consumerism, as a stronger economy and advances in technology and industry fueled demand for new goods. This was also the period when the "baby boomer" generation was born; in a stable, post-war United States, the population exploded.
It should be noted, however, that African Americans did not share equally in this prosperity. Racial tensions and divides, segregation, and Jim Crow laws in the South, among other injustices, helped spur the Civil Rights Movement.