This question assumes that the treatment of Germany in the aftermath of World War I was unfair, and there are certainly those who would dispute that characterization. That said, there is little question that the treatment of Germany played a large part in bringing about World War II.
To begin, the Treaty of Versailles forced Germany to accept blame for the war (the so-called "war guilt" clause) and to pay enormous reparations to the victors. This bred deep resentment in Germany and placed a crushing burden on its economy, which struggled for years under the weight of those payments. The intent was deliberate: to punish Germany and keep it weak.
The Allies also intended to ensure that Germany could never again rise as a military power. The treaty required Germany to drastically reduce its army, abolish its air force entirely, and give up its submarines.
Germany also lost territory to the victors, roughly 13% of its land all told. This, too, deepened the nation's humiliation and further damaged its economy.
These festering resentments set the stage for the rise of Hitler, who promised renewed national pride and a restored economy and who, once in power, began to secretly rebuild the military. The French and the British, who were close enough to enforce the treaty's restrictions on rearmament, were quite lax about doing so, and Hitler was thus able to assemble a war machine.
Whether or not Germany's treatment after World War I was unfair, it is clear that a different approach, one that at the very least helped Germany get back on its feet economically, might have gone a long way toward keeping the peace. Instead, Germany, humiliated and impoverished, with its land stripped away and handed to others, was highly motivated to wage war once again.