What are some ways in which World War I changed America?

Expert Answers
pohnpei397 | Certified Educator

World War I changed America in many important ways. Let us look at three of them.

It made America more isolationist. Americans did not want to be drawn into another war that, in their minds, did not serve American interests. As a result, the country turned decisively inward in its foreign policy.

It helped bring about Prohibition. Anti-German sentiment during the war helped get Prohibition passed: many brewing companies were owned by German immigrants, so alcohol could be associated with the enemy. This association encouraged Americans to support Prohibition.

It started the Great Migration of African Americans out of the South. Before the war, African Americans were a largely rural, Southern population. The war opened up jobs in Northern factories, beginning a process by which African Americans became more heavily Northern and urbanized.
