Homework Help

How did the United States change after WWII?


legerdemain | eNoter

Posted March 1, 2011 at 2:52 PM via web

How did the United States change after WWII?

Tagged with discussion, history

11 Answers


pohnpei397 | College Teacher | (Level 3) Distinguished Educator

Posted March 1, 2011 at 3:00 PM (Answer #2)


The United States changed in so many ways after WWII that it is hard to pick the most important one, let alone list them all.

One candidate for the most important change would be that the US became much more prosperous after the war.  Before the war, the US was a rich country, but just one of many, and it was still struggling because of the Depression.  After the war, the US was, without any rival, the richest country in the world, since all its competitors had been devastated by the fighting.

The fact that the US was far and away the richest country in the world affected it in many ways.  For example, it allowed the country to enact the GI Bill, sending millions of people to college and expanding the nation's human resources.  As another example, it allowed Americans to feel that they had an almost God-given right to lead the world and to enjoy constant economic growth.  This has formed the basis of the American outlook for the last 60 years.

There are many other ways in which the US changed, but the increased prosperity is perhaps the most important of them.


akannan | Middle School Teacher | (Level 3) Distinguished Educator

Posted March 1, 2011 at 3:08 PM (Answer #3)


The emergence of the United States as a world superpower transformed how it was perceived in the world.  Europe had been devastated by the war, and nations in Asia, Africa, and South America that had been under colonial control were now free to pursue their own destinies.  In the end, it is this absolute change that distinguished the United States from other nations in the world.  The Soviet Union was the other superpower, but the capitalist, liberal United States was seen as the preeminent nation on the world stage.  Domestically, there was an unbounded enthusiasm about the promise and possibilities of "the American Dream."  The soldiers who came home after defeating Hitler and the Axis powers believed they could provide a life of material comfort to the women who awaited their return.  The economic boom of post-war America allowed the nation to define its dreams of the future in material terms, and this helped to change how Americans viewed themselves and their world.


lmetcalf | High School Teacher | (Level 3) Senior Educator

Posted March 1, 2011 at 5:17 PM (Answer #4)


After WWII the United States was the supreme military power in the world, especially because we had atomic weapons, as we had demonstrated in Japan.  That power created a rivalry with the USSR, and we were quickly thrust into the Cold War with a country that had been our ally less than a decade before.  The thought of nuclear holocaust cast a subtle shadow over much of the promise and prosperity that dominated the lives of most Americans in the 1950s and 1960s.


litteacher8 | Middle School Teacher | (Level 1) Distinguished Educator

Posted March 1, 2011 at 6:26 PM (Answer #5)


One of the greatest changes in the United States was that most women had entered the workplace in one way or another during the war, while the men were off fighting.  Many did not return to being homemakers after the war, and some found themselves competing with the men returning home.  Another change had to do with the structure of the economy.  Almost everyone had been involved in the war effort in some way, and we had grown quite efficient.  That efficiency was turned to the mass production of consumer goods on a scale not seen before the war.


accessteacher | High School Teacher | (Level 3) Distinguished Educator

Posted March 2, 2011 at 3:32 AM (Answer #6)


I think one aspect worth commenting on is the way the US became much more involved in European affairs, and in particular how the seeds of a "special relationship" with the United Kingdom were planted thanks to the US's involvement in the war. This, of course, is something that has been much deplored in Britain in recent years, especially as Blair was portrayed as Bush's poodle, following him into the invasions of Afghanistan and Iraq after 9/11.


mshurn | College Teacher | (Level 1) Educator Emeritus

Posted March 2, 2011 at 7:16 AM (Answer #7)


The post-war explosion in the production of consumer goods created an explosion in the advertising industry to sell them, which led to an increased desire to own them, which in turn led to more production and to diversification and competition among brands. This emphasis on materialism led to the creation of an economy run on credit, something essentially unknown before the war. Prior to World War II, most families borrowed money only to buy a house; many people saved their money to buy a car. The credit card represents a major change that emerged in the post-World War II boom.


catd1115 | Middle School Teacher | (Level 3) Assistant Educator

Posted March 3, 2011 at 1:28 PM (Answer #8)


Wow, what a huge question! I think you can see from the above posts that there are many answers and that it is an important question historically. I think the biggest change is in Americans' perception of themselves and their country. As mentioned above, there were women who did not necessarily want to be put back in the kitchen, but there was also the rise of the suburbs, due in part to the new prosperity and the GI Bill. Americans began to see themselves as consumers and as the trendsetters in all things. In addition, America became THE superpower, which of course led to the Cold War. It is not just about the events that took place in postwar America; it is about the attitude that Americans began to have about themselves, one that most of us retain today. The idea that America (and Americans) are special and unique continues as a result of our successes in WWII and our prosperity during the postwar period.


lhc | Middle School Teacher | (Level 3) Educator

Posted April 29, 2011 at 5:33 AM (Answer #9)


Ironically, the America that went into World War II licking its wounds from years of economic depression emerged from the war with an unprecedented desire for, and access to, consumer products, not the least of which was the boom in home ownership.  The modern suburb, with all its attendant positives and negatives, is rooted in the post-World War II era; the rise of advertising as a multi-million-dollar industry aimed at selling not just products but lifestyles to Americans dates from this period, as does the concept of consumer credit.


brettd | High School Teacher | (Level 2) Educator Emeritus

Posted July 14, 2011 at 11:53 AM (Answer #10)

Long an industrial power, the US became hyper-industrialized after World War II, the world leader in manufacturing for what would be a very long time.  Our population became permanently more mobile, as people had been shuffled in large numbers to the west and north during the war, and we were never truly a sedentary nation again.  United in war, the country afterwards remained united against communism.  Large numbers of African Americans moved to major cities in the North and West, and the US economy grew the middle class to its largest size and highest standard of living to date.

Yojana_Thapa | Student, Grade 10 | eNoter

Posted January 25, 2014 at 8:59 PM (Answer #11)


WWII had a huge impact on the United States because the country came out of it as the world's superpower. The need for weapons pulled us out of the Great Depression, and the U.S. economy grew after WWII. Women had a huge impact during WWII as well. Because of all this involvement, WWII was crucial. America prospered!


parama9000 | Student, Grade 11 | Valedictorian

Posted January 30, 2014 at 8:41 AM (Answer #12)


The US emerged as a superpower, and women's suffrage was brought into prominence, not just in the US.
