One change was that the nature of the war dealt a serious (but not fatal) blow to American isolationism as a mainstream political view. American public opinion in the wake of the war was far friendlier to involvement in geopolitics, particularly participation in alliance systems (like NATO) and international organizations (like the United Nations). Part of this was due to the war itself, which was won by the Allies in no small part through the efforts of the United States, especially in the Pacific. Part was due to the obvious need for global leadership amid the devastation and turmoil wrought by the conflict. Yet another reason was the outbreak of the Cold War, itself a legacy of World War II.
The war, along with the Great Depression, also altered Americans' views of the role of government in the economy and in people's everyday lives. The New Deal had been the single largest expansion of the federal government in American history, but the war effort dwarfed Roosevelt's peacetime programs in both expense and extent. Massive wartime investments in industry were not totally abandoned in its wake, and the postwar period saw some continuity in government social programs, most conspicuously the GI Bill of Rights.
Finally, the war, fought as it was against a totalitarian regime that based its rule on racial tenets, helped to accelerate the discussion of civil rights in the United States. Black soldiers who had served abroad chafed under Jim Crow at home, and many white liberals changed their thinking about white supremacy. This shift did spark a massive backlash from whites, especially in the South, but historians widely consider the war a watershed moment in race relations in the United States.