What political changes occurred in the United States as a result of declaring war on Germany in WWI?

pohnpei397 | College Teacher

The major political change that occurred in the US as a result of entering World War I was a reduction in the political freedom that American citizens enjoyed.

During its participation in WWI, the US government was deeply intolerant of dissent.  Laws such as the Espionage Act of 1917 and the Sedition Act of 1918 criminalized many forms of opposition to the war.  The most prominent example of this intolerance was the imprisonment of Eugene V. Debs, the Socialist Party leader and frequent presidential candidate, for a speech urging resistance to the draft.  This infringement on Americans' civil liberties was the most important political change that came with US involvement in WWI.
