World War I Questions and Answers
by Edward Paice


What are some ways in which World War I changed America?

Expert Answers

pohnpei397, eNotes Educator | Certified Educator


World War I changed America in many important ways. Let us look at three of them.

It made America more isolationist. After the war, Americans did not want to get involved in another conflict that was not, in their minds, vital to American interests. As a result, the country turned sharply toward isolationism in its foreign policy.

It helped bring about Prohibition. Anti-German sentiment during the war helped get Prohibition passed: many brewing companies were owned by German immigrants, so alcohol could be associated with the enemy. That association encouraged Americans to support Prohibition.

It started the Great Migration of African Americans out of the South. Before the war, African Americans were a largely rural, Southern population. Wartime labor shortages opened up factory jobs in the North, beginning a process by which African Americans became more heavily Northern and urban.

Approved by eNotes Editorial