WWII had a tremendous impact on Britain's relationship with its empire. The war weakened Britain both economically and militarily, and it brought the United States to the fore as the most important country in the world, eclipsing Britain's political influence as well.
After WWII, Britain no longer had the economic or military power to easily hold down an empire. Rationing continued in England for years after the war, and Britain could no longer pay for the military forces it would need to exert power in places like Greece and Turkey. (It was Britain's relinquishing of influence in those countries that prompted the Truman Doctrine.) Britain was simply no longer as powerful as it had once been.
On top of this, the US had become the most powerful country in the world, and it was not particularly supportive of colonialism and empire. This put pressure on Britain either to decolonize outright (as with India) or to move toward that outcome (as with its African colonies).
For these reasons, WWII fundamentally transformed Britain's relationship with its empire.