What political changes occurred in the United States as a result of declaring war on Germany in WWI?

Expert Answers
pohnpei397, eNotes Educator | Certified Educator

The major political change that occurred in the US as a result of its entry into World War I was a curtailment of the political freedoms that American citizens enjoyed.

During the time that the US participated in WWI, the government was very intolerant of dissent. Laws such as the Espionage Act of 1917 made it a crime to interfere with the war effort, including the draft. The most prominent example of this intolerance was the episode in which Eugene V. Debs, a prominent political figure, was jailed under the Espionage Act for urging people to resist the draft. This sort of infringement on Americans' civil liberties was the most important political change that came with US involvement in WWI.