Did WWII have anything to do with the decolonisation of the British empire?

pohnpei397 | College Teacher | (Level 3) Distinguished Educator

World War II had everything to do with the decolonization of the British Empire.  There were at least three reasons for this.

First, the war was tremendously damaging to Britain's economy. The country emerged deeply in debt and remained in dire financial straits for years afterward; rationing in Britain itself, for example, continued into the 1950s. This made it much harder to bear the costs of maintaining an empire.

Second, the war damaged Britain's international prestige. Britain had been on the winning side, but it had relied heavily on its allies, chiefly the United States and the Soviet Union, to do most of the work of winning. Britain itself had suffered humiliating defeats at the hands of the Japanese, most notably the fall of Singapore in 1942. These defeats made other countries, and colonial peoples themselves, less willing to accept the idea of a British Empire.

Finally, the war left the United States in a dominant position in the international order. The United States was generally opposed to colonial empires, and it therefore put pressure on Britain to decolonize.

In these ways, the collapse of the British Empire "can be traced directly to the impact of World War Two."
