The Treaty of Versailles ended World War I and had profound effects on Germany. Because of this treaty, Germany entered a severe economic crisis, which provided fertile ground for the rise of fascism and the forces of Hitler’s Nazi Party.
By 1918 Germany’s armies had been pushed back toward their own borders. Rather than allow an Allied invasion to drive them back farther, Germany surrendered, and the victorious Allied powers hammered out a peace settlement known as the Treaty of Versailles. The terms of this treaty were harsh and led to significant political changes for the country.
In the treaty, Germany was declared solely responsible for the war and forced to pay huge sums of money in reparations for the damage. This crippled the country’s finances and undermined the authority of the ruling Weimar Republic. Germany’s army was also drastically reduced in size, which made maintaining order amid the postwar power struggles even more difficult.
As a result of the treaty, Germany’s government was unable to rule effectively, emboldening radical political parties such as the Communists and the National Socialist German Workers’ Party, or Nazis as they were known. These groups challenged the government’s authority, producing near anarchy in Germany for years as a new order struggled to establish itself.
Eventually, the Nazi Party gained enough support to have its leader, Adolf Hitler, appointed to the chancellorship. From there, he used Nazi party thugs to remove his political enemies from power and eventually established a totalitarian state. Without the Treaty of Versailles, this might never have happened.