What did the USA gain in World War I?
The United States did not enter World War I for material or territorial gain, and it gained nothing tangible from fighting in it: no new territory and no indemnities from the losers.
The US did gain, though, in two ways. First, it gained economically through its trade with and loans to Great Britain. Second, it gained in international prestige. Before the war, the US was really not much of a player on the international stage. By participating in the war and in the peace talks after the war, the US made itself a much more important country in terms of world affairs.
World War I led to intangible gains for the US. Although the US gained no new territories or other material benefits, it gained the stature of an industrial and economic giant.
The US emerged from the war as one of the leading powers in military strength, economic capacity, and industrial output. While Europe was engulfed in the fighting, the US remained neutral at first and was able to do business with all sides. With Britain's naval blockade cutting off Germany's trade and the Allies' material needs growing as the war dragged on, the Allies turned to the US to supply them, which it largely did. As a result, the US gained favor and respect among the Allied powers.