One of the greatest changes in the United States was that most women had worked outside the home in one way or another during the war, while the men were off fighting. Many did not return to housekeeping after the war, and some competed with the men returning home. Another change had to do with the structure of the economy. Almost everyone was involved in the war effort in some way, and the country had grown quite efficient. That efficiency was turned to the mass production of consumer goods on a scale not seen before the war.
The emergence of the United States as a world superpower transformed its perception in the world. Europe had been devastated by the war, and nations in Asia, Africa, and South America that had been under colonial control were free to pursue their own destinies. In the end, it is this absolute change that distinguished the United States from other nations in the world. The Soviet Union was the other superpower, but the capitalist, liberal United States was seen as the preeminent nation on the world stage. Domestically, there was unbounded enthusiasm about the promise and possibilities of "the American Dream." The soldiers who came home after defeating Hitler and the Axis powers believed in being able to provide a life of material comfort to the women who awaited their return. The economic boom of postwar America allowed the nation to define its dreams of the future in material terms, and this helped change how Americans viewed themselves and their world.
Ironically, the America that went into World War II licking its wounds from years of economic depression emerged from the war with an unprecedented desire for, and access to, consumer products, not least a boom in home ownership. The modern suburb, with all its attendant positives and negatives, is rooted in the post-World War II era; the rise of advertising as a multi-million-dollar industry aimed at selling not just products but lifestyles to Americans emerged from this period, as did the concept of consumer credit.
Wow, what a huge question! I think you can see from the above posts that there are many answers and that it is an important question historically. I think the biggest change is in Americans' perception of themselves and their country. As mentioned above, there were women who did not necessarily want to be put back in the kitchen, but also the rise of suburbs, due in part to the new prosperity and the GI Bill. Americans began to see themselves as consumers and as trendsetters in all things. In addition, America became THE superpower, which of course led to the Cold War. It is not just about the events that took place in postwar America; it is about the attitude that Americans began to have about themselves, one that most of us retain today. The idea that America (and Americans) are special and unique continues as a result of our successes in WWII and our prosperity during the postwar period.
The post-war explosion in the production of consumer goods created an explosion in the advertising industry to sell them, which led to an increased desire to own them, which in turn led to more production and to diversification and competition among brands. This emphasis on materialism led to the creation of an economy run on credit, essentially unknown before the war. Prior to World War II, most families borrowed money only to buy a house; many people saved their money to buy a car. The credit card represents a major change that occurred in the post-World War II boom.
I think one aspect that is worthy of comment is the way that the US became much more involved in European affairs, and in particular how the seeds of a special relationship with the United Kingdom were formed thanks to the US's involvement in the War. This of course is something that has been much deplored in Britain in recent years, especially as Blair was portrayed as Bush's poodle, following him into an invasion of Iraq and Afghanistan after 9/11.
After WWII the United States was the supreme military power of the world, especially because we had atomic weapons, as we demonstrated in Japan. That power created a rivalry with the USSR, and we were quickly thrust into the Cold War with a country that had been our ally not even a decade before. The thought of nuclear holocaust cast a subtle shadow over much of the promise and prosperity that dominated the lives of most Americans in the 1950s and 1960s.
There are so many ways in which the United States changed after WWII that it is hard to pick the most important one, let alone to list all the ways in which the country changed.
One candidate for the most important change would be that the US became much more prosperous in the years after the war. Before the war, the US was a rich country, but just one of many, and it was struggling because of the Depression. After the war, the US was, without any rival, the richest country in the world, since all its competitors had been devastated by the fighting.
The fact that the US was far and away the richest country in the world affected it in many ways. For example, it allowed the country to enact the GI Bill, sending millions of people to college and expanding the country's human resources. It also allowed Americans to feel that they had an almost God-given right to lead the world and to enjoy constant economic growth. This has formed the basis of the American outlook for the last 60 years.
There are many other ways in which the US changed, but the increased prosperity is perhaps the most important of them.
After WW2, the specific outcomes included the establishment of the United Nations and separate peace settlements with Germany and its allies. Germany was defeated and stripped of its territorial holdings, and Japan likewise lost its overseas empire. (Info found: My World History Notebook)
The US emerged as a superpower, and women's suffrage was brought into prominence in many countries, not just the US.
WWII had a huge impact on the United States because the country came out of it as the world's superpower. The demand for weapons pulled us out of the Great Depression, and the U.S. economy grew after WWII. Women had a huge impact in WWII as well. Because of all this involvement, WWII was crucial. America prospered!