What are some ways in which World War I changed America?


World War I

Answer:

World War I changed America in many important ways. Let us look at three of them.

It made America more isolationist.  Americans did not want to be drawn into another war that was not, in their minds, important to American interests.  Therefore, the country turned toward a much more isolationist foreign policy.

It helped bring about Prohibition.  Anti-German sentiment during the war helped get Prohibition passed: many brewing companies were owned by German immigrants, so alcohol could be associated with the enemy.  This encouraged more Americans to support Prohibition.

It started the Great Migration of African Americans out of the South.  Before the war, African Americans were largely a rural, Southern population.  The war opened up factory jobs in the North, beginning a process by which African Americans became far more Northern and urbanized.
