What did the USA gain in World War I?
The United States did not enter World War I for material or territorial gain, and it came away with little that was tangible: it received no new territory and no indemnities from the losers.
The US did gain, though, in two ways. First, it gained economically through its trade with, and loans to, Great Britain. Second, it gained in international prestige. Before the war, the US was not much of a player on the international stage. By participating in the war and in the peace talks that followed, the US made itself a far more important country in world affairs.