What effect did World War I have on American attitudes toward the outside world?

Expert Answers
davmor1973 | Certified Educator

The First World War made Americans much more insular in their outlook on the outside world. There was nothing unusual about this. The growing mood of isolationism tapped into a long-held suspicion of "foreign entanglements," an American tradition going all the way back to Washington's Farewell Address. Many Americans resented being pulled into a conflict that they believed was no concern of theirs.

The United States had only entered the war in 1917, just over a year before the Armistice, and it was not immediately apparent to many that the country had gained any particular benefit from its participation. Noble talk from President Wilson about making the world safe for democracy cut little ice with many Americans. For them, the First World War was a conflict between European monarchies, the kind of old-fashioned political systems that Americans had fought so hard to escape in the Revolutionary War.

teachersage | Certified Educator

Many Americans felt that World War I had been a waste of money and American lives, bringing the country debt but no benefits. The bipartisan response was to withdraw into isolationism, a longstanding U.S. policy dating back to George Washington's Farewell Address. Following the war, the U.S. refused to join the League of Nations, passed immigration quotas, and raised tariffs on foreign goods entering the country. The military was kept small until the eve of World War II, when FDR began preparing for war. The majority sentiment up until Pearl Harbor was to let Europe deal with its own problems rather than let the country be dragged into them. The end of World War II decisively changed that outlook. As the U.S. took on the mantle of world power after the war, most Americans decided that engagement in international affairs was important to safeguarding American interests.

pohnpei397 | Certified Educator

World War I made Americans more isolationist and pacifist, attitudes that lasted until the late 1930s.

WWI soured Americans on foreign affairs. They felt that they had been pulled into a war that was not really important to US interests. As a result, they hoped to stay out of foreign affairs, except when they acted to try to prevent war. These attitudes can be seen in such measures as the Washington Naval Agreements (meant to prevent arms races that would lead to war) and the Neutrality Acts of the 1930s (meant to keep the US from getting involved with countries that were at war).

In these ways, WWI pushed American attitudes toward pacifism and isolationism.
