The United States did experience major social unrest during the post-war era, especially in the early years of the Depression. In many places, it came closer to revolution than we might imagine. The same was true of France and Britain; France, in particular, veered from far right to far left in the 1930s. I agree with post 3, however, that desperate conditions in Germany, along with an extreme sense of bitterness in the wake of World War I, paved the way for a radical response there. Still, it didn't have to happen. Historians have emphasized a great deal of contingency in Hitler's rise to power, especially early on.