World history is filled with tales of conquest and foreign domination. Mercantile expansion in the fifteenth and sixteenth centuries, together with a Western commitment to exploration, laid the groundwork for modern colonialism, which reached its height at the end of the nineteenth century. At that time, vast Western empires (such as the British and the French) spanned the globe, forcefully tying disparate cultures and societies to Western civilization. The age of empire officially ended after World War II, but colonialism has persisted in other forms: First World countries, such as Great Britain, France, and the United States, continue to dominate Third World countries.