What impact did WWI have on colonies and colonialism? 

Expert Answers
pohnpei397, eNotes Educator | Certified Educator

World War I had very little impact on the institution of colonialism as a whole. It affected specific colonies, but it did not change the fact that colonialism was widespread, nor did it lead to any serious change in the way colonialism was practiced.

After WWI, many colonies changed hands. German colonies in Africa were split up and given to countries such as Britain, Belgium, and Portugal. German territories in Micronesia were officially placed under League of Nations mandates but were essentially given to Japan. Ottoman territories in the Middle East were divided between Britain and France on the same basis.

What did not happen was any widespread independence. Vietnamese nationalists pressed for independence from France but did not get it. India did not become independent of Great Britain. The war changed which countries controlled which colonies, but it did not seriously challenge the idea or practice of colonialism.
