World War I most definitely did not end in lasting peace. In fact, WWI is often seen as one of the major causes of World War II, which began only 21 years later.
After WWI, the Allied Powers imposed the Treaty of Versailles on Germany. The treaty's many harsh provisions angered the Germans and damaged their economy.
Partly because of that anger, Germany soon began working to break the treaty and restore its strength. This resentment helped bring Hitler to power and contributed to the outbreak of World War II.
No. Historians may well come to view the period from 1914 to 1945 as the first and second acts of a single world at war. World War II grew out of the failures of the peace that ended World War I, namely the flawed Treaty of Versailles and the League of Nations. Both world wars involved the same combatants, who generally allied with the same partners in both acts.

One could argue that had World War II not occurred, the Treaty of Versailles would have been a success and a lasting peace could have been ensured. However, the main reason Versailles failed was that many of the European combatants emerged from World War I intent on preserving their empires, while people living in the far-flung colonies, as well as within Europe itself, were not content to be dominated by the major powers. The treaty did little to address nationalism; it merely deferred the issue to the next generation, sowing seeds of discord that ripened into warfare within 20 years. At the end of World War II, the issue of nationalism was resolved by the wholesale destruction of those same empires.