World War I did not end in lasting peace. On the contrary, it is widely regarded as one of the major causes of World War II, which broke out only 21 years after the armistice.
After the war, the Allied Powers imposed the Treaty of Versailles on Germany. Its harsh provisions, including heavy reparations, territorial losses, and strict limits on the German military, caused deep resentment among Germans and damaged their economy.
Partly because of this resentment, Germany soon sought to circumvent the treaty and restore its power. That climate of grievance helped bring Hitler to power and contributed directly to the outbreak of World War II.