The Aftermath of World War II


What effect did World War II have on the political dispensation of America?   

Expert Answers


World War II affected American politics in complex ways. For one thing, it essentially ended the strong isolationist strain in American politics, relegating it to the fringes of the political spectrum. The war, along with the Great Depression, also created what has often been called a "liberal consensus": a broad agreement that the government had a major role to play in the economic well-being of its citizens. Most Democrats and even most Republicans supported programs such as the GI Bill and other measures that promoted economic expansion. The war also created growing momentum for civil rights for African Americans, who served with distinction in the armed forces and achieved significant economic gains at home. Finally, the end of the war brought about the Cold War, which altered the landscape of American politics in profound ways. The nation was put on an anti-communist footing after World War II, a condition that led many Americans to accept and even embrace the rapid expansion of the American military, the development of increasingly destructive weapons, and the emergence of what President Dwight Eisenhower, in his January 1961 farewell address, called the "military-industrial complex." These are just a few of the complex and often contradictory changes in postwar American politics.

Approved by eNotes Editorial Team