What were the consequences of World War I for the United States?


World War I

Answer:

World War I was not as consequential for the US as World War II, but it was important nonetheless. Among its effects were:

  • It made the US a more important world power. Because American participation swung the outcome of the war, the US emerged from it as a first-rate power.
  • It also pushed the US toward isolationism. Many Americans felt the war had been unnecessary and did not want to become involved in further foreign conflicts.
  • It helped fuel intolerance in the US, both during and after the war. During the war, propaganda stirred up animosity toward Germans. After the war, a "Red Scare" arose, in part as a consequence of the war.
  • It helped draw African Americans to the North. African Americans moved north to work in factories producing war materiel, a migration that began to change the African American experience.
