What effect did World War II have on the political dispensation of America?  

Expert Answers
rrteacher, eNotes Educator | Certified Educator

World War II affected American politics in complex ways. For one thing, it effectively ended the strong isolationist strain in American politics, relegating isolationism to the fringes of the political spectrum. The war, along with the Great Depression, also produced what has often been called a "liberal consensus": broad agreement that the government had a major role to play in the economic well-being of its citizens. Most Democrats and even most Republicans supported programs such as the GI Bill and other measures that promoted economic expansion.

The war also built momentum for civil rights for African Americans, who served with distinction in the armed forces and made significant economic gains at home. Finally, the end of the war ushered in the Cold War, which altered the landscape of American politics in profound ways. The nation was put on an anti-communist footing, which led many Americans to accept and even embrace the rapid expansion of the American military, the development of increasingly destructive weapons, and the emergence of what President Dwight Eisenhower, in his 1961 farewell address, called the "military-industrial complex." These are just a few of the complex and often contradictory changes in postwar American politics.
