I would argue that WWI had very little impact on colonies and colonialism. In theory, WWI might have helped to do away with colonialism, but in practice it is hard to argue that it did.
One might have expected Woodrow Wilson's 14 Points to mean that WWI would help to end colonialism. In Point XII, for example, Wilson said that the non-Turkish parts of the Ottoman Empire
"should be assured an undoubted security of life and an absolutely unmolested opportunity of autonomous development."
This implies that those areas should have been allowed to become independent, and if they deserved that chance, then logically other colonized areas deserved it as well.
However, this is not what happened. Instead, essentially everywhere that had been a colony before the war was still a colony in practice after the war. Many areas, like the former Ottoman territories or the former German territories in Micronesia, became League of Nations mandates, but those mandates were, in essence, simply colonies of the countries that were supposed to administer them on behalf of the League. Other places, like French Indochina and British India, simply remained in the hands of their pre-war colonial masters. All in all, it is hard to say that WWI had much of an impact on colonies and colonialism, regardless of what Wilson said in his 14 Points.