What effect did World War I have on American attitudes toward the outside world?

Asked on by lro1979

pohnpei397 | College Teacher | (Level 3) Distinguished Educator


World War I made Americans more isolationist and pacifist, attitudes that lasted until the late 1930s.

WWI soured Americans on foreign affairs.  They felt that they had been pulled into a war that was not really important to US interests.  Therefore, they hoped to remain isolated from foreign affairs, except for actions meant to prevent future wars.  These attitudes can be seen in such things as the Washington Naval Agreements (meant to prevent arms races that could lead to war) and the Neutrality Acts of the 1930s (meant to keep the US from getting involved with countries that were at war).

In these ways, WWI pushed American attitudes toward pacifism and isolationism.

teachersage | (Level 2) Senior Educator


Many Americans felt that World War I had been a waste of money and American lives, and that it had brought debt but no benefits to the country. The bipartisan response was to withdraw into isolationism, a longstanding U.S. policy dating back to George Washington's Farewell Address. Following the war, the U.S. refused to join the League of Nations, passed immigration quotas, and raised tariffs on foreign goods entering the country. The military was kept small until the eve of World War II, when FDR began preparing for war. The majority sentiment up until Pearl Harbor was to let Europe deal with its own problems rather than let the country be dragged into them. The end of World War II decisively changed that outlook. As the U.S. took on the mantle of world power after the war, most Americans decided that our presence in international affairs was important to safeguarding our interests.
