What were the consequences of WWI?
The major consequence of WWI was WWII.
After WWI, the Allied Powers imposed the Treaty of Versailles on Germany. The treaty humiliated Germany and angered many Germans: it stripped away German territory and barred Germany from maintaining a strong military. It also weakened Germany economically by forcing it to pay huge reparations to the Allies for the costs of the war.
Because Germany was weakened and humiliated, many Germans wanted to restore what they saw as their country's rightful status as a great power. This led them to support Hitler and his Nazi Party. Hitler's attempts to regain lost power (and then some) led to WWII.
Without WWI and the Treaty of Versailles, WWII would not have happened, at least not in the way that it did.