What impact did WWI have on colonies and colonialism? 

pohnpei397 | College Teacher | (Level 3) Distinguished Educator

World War I had very little impact on the institution of colonialism as a whole. It affected specific colonies, but it did not make colonialism any less widespread, nor did it seriously change the way colonialism was practiced.

After WWI, many colonies changed hands. German colonies in Africa were split up and given to such countries as Britain, France, Belgium, South Africa, and Portugal. German territories in Micronesia were officially placed under League of Nations mandates but were essentially given to Japan. Ottoman territories in the Levant were divided between Britain and France on the same basis.

What did not happen was any widespread independence. Vietnam pressed for independence from France but did not get it. India did not become independent of Great Britain. The war changed which countries controlled which colonies, but it did not seriously challenge the idea or practice of colonialism itself.
