The most common answer to this is that the United States adopted a policy of isolationism in the years after World War I. However, this is not strictly correct. The true policy of isolationism did not come until the 1930s. During the 1920s, the US adopted a foreign policy that was meant to prevent war, but one which engaged with other countries in this attempt.
Two major events show that this was true. The first was the Washington Naval Conference and the treaties that arose from it. These treaties attempted to set limits on the sizes of the great powers’ navies, with the aim of preventing the sort of massive arms race that had helped bring about WWI. The second was the Kellogg-Briand Pact, a treaty signed in 1928 that committed the US and the other signatories to renounce war. In other words, this treaty outlawed war. These were efforts by the US to prevent another war, but they were not isolationist.
Isolationism only came later, particularly in the 1930s. By then, it seemed clear that the efforts of the 1920s were not working. Because of this, many Americans gave up on the idea of preventing war among other countries and concentrated instead on keeping the US from being dragged into any wars that did occur. This was seen most clearly in the Neutrality Acts of the 1930s.
Thus, US policy right after WWI was one of making treaties to prevent war. Only after that effort seemed to fail did the US turn to isolationism.