What were the consequences of WWI?

pohnpei397 | College Teacher | (Level 3) Distinguished Educator

The major consequence of WWI was WWII.

After WWI, the Allied Powers imposed the Treaty of Versailles on Germany.  This treaty humiliated Germany and made Germans very angry.  It did this by, for example, taking away German territory and preventing Germany from having a strong military.  It also weakened Germany by forcing it to pay huge reparations to the Allies for the costs of the war.

Because Germany was weakened and humiliated, Germans wanted to restore what they saw as their rightful status as a great country.  This led them to support Hitler and his Nazi Party.  Hitler's attempts to regain lost power (and then some) led to WWII.

Without WWI and the Treaty of Versailles, WWII would not have happened, at least not in the way that it did.
