I would argue that the most important aspect of WWII, when it comes to decolonization, was the loss of power suffered by the United Kingdom, France, and other European colonial states. These countries lost both their aura of power and their role as world leaders, and both losses made it harder for them to keep their colonies.
In WWII, France, Belgium, and the Netherlands were all defeated by Germany, and Britain was defeated in Asia by Japan. These countries had held their colonies, in part, by seeming to be powerful: their aura of power had led their colonial subjects to accept colonization as inevitable. When these countries were defeated (especially when Britain fell to Japan, a non-white power), that aura eroded and colonized peoples became far less willing to accept their fate.
WWII also changed world leadership. France and Britain lost their status as great powers, and the United States emerged as the leader of the Western world. This helped bring about decolonization in at least two ways. First, the war left France and Britain economically weakened, making it much harder for them to afford the expense of holding their empires. Second, since the US was largely opposed to colonialism (not entirely, as its support for French control over Vietnam shows), it pressured France and Britain to eventually set their colonies free.
Thus, WWII helped bring about decolonization largely because it weakened European colonial powers, making it harder for them to maintain their empires.