"Imperialism" is a loaded and problematic term, notably due to the fact that some critics of American foreign policy use it to describe many American actions around the world since the late nineteenth century. Others use the term globalization, for example, as another term for American economic imperialism. Still, some scholars use "imperialism" to refer only to a isolated period in time—the late nineteenth and early twentieth centuries—while others would point to the American conquest of North America, which entailed the destruction of Native peoples and war with Mexico, as the epitome of imperialism.
Broadly defined, imperialism has played a major role in American economic expansion as American goods and investment flowed into countries around the world. Some might also argue that American imperialism has fostered the spread of democratic governments around the world, though this is a more contentious claim. There is no doubt, however, that these advances have come at a huge cost. In the late twentieth and twenty-first centuries, the enhanced global role that began with explicitly imperialist policies has drawn the nation into nearly constant warfare in various locations around the world. Perhaps the most important legacy of imperialism in the long run, however, is that it stands in direct contrast to stated American ideology. The United States was founded in an anti-imperial struggle, and its imperialist actions around the world—perhaps most notably in the Philippines in the early twentieth century—have made it an imperial power, denying autonomy to peoples who looked to the Declaration of Independence, and other assertions of human liberties, as inspiration for their own freedom.