What effect did World War I have on American attitudes toward the outside world?

pohnpei397 | College Teacher | (Level 3) Distinguished Educator

World War I made Americans more isolationist and pacifist, attitudes that would last until the late 1930s.

WWI soured Americans on foreign affairs.  They felt that they had been pulled into a war that was not really important to US interests.  Therefore, they hoped to remain isolated from foreign affairs, except for actions meant to prevent future wars.  These attitudes can be seen in such things as the Washington Naval Agreements (meant to prevent the arms races that could lead to war) and the Neutrality Acts of the 1930s (meant to keep the US from becoming entangled with countries that were at war).

In these ways, WWI turned American attitudes toward pacifism and isolationism.
