What are some ways in which World War I changed America?
World War I changed America in many important ways. Let us look at three of them.
It made America more isolationist. Americans did not want to be drawn into another conflict that, in their minds, did not serve American interests, so the country turned sharply toward isolationism in its foreign policy.
It helped bring about Prohibition. One factor that helped get Prohibition passed was anti-German sentiment during WWI. Many brewing companies were owned by German immigrants, so alcohol could be associated with the enemy. This helped push Americans to support Prohibition.
It started the Great Migration of African Americans out of the South. Before the war, African Americans were largely rural, Southern people. The war opened up factory jobs in the North, beginning a process through which African Americans became more heavily Northern and urbanized.
Posted by pohnpei397 on October 23, 2012 at 7:57 PM