The most significant event that WWI helped cause was WWII.
After WWI, the Treaty of Versailles treated Germany very harshly. Germany was forced to accept guilt for the war and to agree to pay heavy reparations to the countries damaged by it. It also had various territories taken away and was not allowed to maintain a full military.
All of this weakened Germany's economy and, at the same time, made Germans very resentful. The bad economy and the resentment led many Germans to support Hitler and his desire to restore Germany to what they saw as its rightful place in the world. This led to the beginning of WWII in Europe.