What did the United States hope to gain from World War II?

Expert Answers


I would argue that it is wrong to say that the US hoped to gain anything from World War II. After all, the US was forced into this war; it was not a war that the US chose to fight. Once the US was in it, though, it hoped to gain an end to Japanese and German aggression and a more peaceful world order after the war.

The United States did not enter the war in order to gain anything. Instead, it entered the war in response to the Japanese attack on Pearl Harbor. The US believed that it had to fight back once it had been attacked in such a major way.

Once the US entered the war, however, it of course had war aims. Its first aim was to force Germany and Japan to surrender unconditionally: it wanted to break those two countries so thoroughly that they would not have the power to start another war. Its second aim was to help create a world in which war would be less likely, which was one reason it sought such a total victory over Japan and Germany. Finally, the US wanted to build a better world order after the war. For example, it wanted to create the United Nations to give countries a better way to interact with one another and to resolve their disputes.

Thus, the US hoped to gain peace from WWII. The country wanted to utterly defeat its enemies and then to set up a world order that would make later wars less likely to occur.

Approved by eNotes Editorial Team