Expert Answers
pohnpei397, eNotes Educator | Certified Educator

The major cause of WWII was that Germany and Japan, in particular, were deeply dissatisfied with the international order that emerged in the years after WWI.

Germany was resentful because it had lost WWI and had had harsh terms imposed on it by the Treaty of Versailles.  Germany believed that it should still be a major power and resented having territory taken away and being denied the right to maintain a strong military (among other things).  These grievances made Germany determined to change the status quo.

Japan had not lost WWI, but it felt the need for a larger empire.  It invaded China and wanted still more territory.  It believed that the US would try to prevent it from acquiring the empire that it felt it needed.

The determination of Germany and Japan to change the world order, and the determination of countries like England and the US to prevent those changes, led to WWII.