What events did WWI cause? Not what caused WWI, but what it caused.

Expert Answers
pohnpei397, eNotes Educator | Certified Educator

The main event that WWI caused was WWII.

After WWI, the Treaty of Versailles treated Germany very harshly. Germany was forced to accept guilt for the war and to agree to pay heavy reparations to the countries damaged by it. Germany also had several of its territories taken away and was not allowed to maintain a full military.

All of this weakened Germany's economy and, at the same time, made Germans deeply resentful. The bad economy and the resentment led many Germans to support Hitler and his promise to restore Germany to what they saw as its rightful place in the world. This led to the outbreak of WWII in Europe.
