Although the United States was involved in World War I only briefly, the war had significant impacts on the country’s foreign and domestic policies.
In foreign affairs, WWI pushed the US toward two types of policy. First, it pushed the US to promote policies meant to prevent another war. President Wilson helped to create the League of Nations. The US participated in arms-limitation talks at the Washington Naval Conference of 1921–22. The US also signed the Kellogg-Briand Pact of 1928, which renounced war as an instrument of national policy. These policies were driven by a desire to avoid another terrible war.
Second, in the 1930s, the US moved toward isolationism. By then, many Americans felt that the country had been dragged into WWI and that the war had not served America’s best interests. To avoid having this happen again, the US withdrew from world affairs to some degree. This trend actually began earlier, when the US Senate refused to ratify the Treaty of Versailles, which would have brought the US into the League of Nations. However, the major isolationist policies came in the 1930s, when Congress passed the Neutrality Acts, laws forbidding various types of trade with countries at war. These laws were meant to keep the US from becoming entangled with either side in a conflict. In these ways, WWI shaped foreign policy by causing the US to work to avoid involvement in future wars.
In domestic policy, the war is generally credited with helping to end the Progressive Era, a period during which the government undertook a wide range of reforms. Historians argue that the war sapped people’s appetite for reform and made them want a “return to normalcy,” as President Harding put it in his 1920 campaign. After the stresses of war, people wanted a calm, stable society, and this helped bring the Progressive Era’s reforms to an end.
In these ways, WWI had a lasting impact on both foreign and domestic policy in the United States.