What did the USA gain in World War I?
The United States did not enter World War I for material or territorial gain, and it received none in any tangible form: no new territory and no indemnities from the defeated powers.
The US did gain, though, in two ways. First, it gained economically through its wartime trade with and loans to Great Britain. Second, it gained in international prestige. Before the war, the US was not a major player on the international stage. By participating in the war and in the peace talks that followed, the US made itself a far more important force in world affairs.
Posted by pohnpei397 on May 26, 2012 at 12:34 AM (Answer #1)