What were the causes of WWII?
The major cause of WWII was that Germany and Japan, in particular, were dissatisfied with the international order that emerged in the years after WWI.
Germany was deeply resentful because it had lost WWI and had harsh terms imposed on it by the Treaty of Versailles. Germany felt that it should still be a major power and objected to having territory taken away and being denied the right to a strong military, among other things. These grievances made Germany determined to change the status quo.
Japan had not lost WWI, but it wanted a larger empire. It invaded China and sought further expansion, and it believed the US would try to prevent it from gaining the empire it felt it needed.
The desire of Germany and Japan to change the world order, and the determination of countries like England and the US to prevent those changes, led to WWII.
Posted by pohnpei397 on May 24, 2012 at 1:28 PM (Answer #1)