It is perhaps overstating things to say that the 1940s were the decade of change. Other decades saw major changes that rival those of the '40s. Still, it was a decade of very important changes.
The most important changes came out of World War II, and they affected both foreign and domestic affairs in the US. On the foreign side, the 1940s saw the beginning of the Cold War and the atomic age. That combination would dominate the international scene for the next 40 years and more. Domestically, the 1940s were the start of a huge change in American society. The end of the war brought the GI Bill and the beginning of a major increase in the wealth of the average American. As America got richer, suburbanization began. America was on its way to the prosperous and staid society of the 1950s, which would in turn give rise to the turmoil of the 1960s.
In these ways and many others, the 1940s was a decade of important changes in America.