How can we describe the United States as a post-colonial society?

Expert Answers
enotechris | Certified eNotes Educator

Over the course of its short history, the United States has gone from colony to colonizer. One could count as acts of colonization its expansion into French and Spanish/Mexican-controlled areas of the continent (prompting Texas to fight for independence and eventually leading to war with Mexico), and later its crossing of the waters to "influence" Hawaii, the Philippines, and Cuba. The last two are significant as colonial conquests outside the contiguous States, gained in the Spanish-American War of 1898. Indeed, as early as the 1820s, the Monroe Doctrine could be read as the States declaring that they alone would colonize the Western Hemisphere under "Manifest Destiny," although both were more or less a bluff until the latter part of the century.

In late antiquity, Rome, the supreme power of the known world, kept order, established trade, and made civic improvements, and culture thrived under the "Pax Romana" that followed its colonial conquests. Today, and for the foreseeable future, whether we like it or not, and whether or not we violate our Founding Principles in the process, we have become "the Policeman of the World," with a few notable current exceptions, and have instituted a worldwide "Pax Americana."

We could therefore conclude that these United States are indeed a post-colonial society, since there's nothing left on Earth to colonize! Yet a Chinese flag may soon join the Stars and Stripes on the Moon, other off-world settlements may be established, and colonization may begin again.

desdamona | Student

Colonies are societies or countries ruled by a different society or country. "Colonialism" describes the period of history when European countries controlled much of the rest of the world for resources, trade, and political advantage. These European countries had empires. America began life as a British colony and was part of the British Empire; Louisiana, along with much of the Deep South, was a French colony.

After independence, America was no longer a colony, so, simplistically, you could say it is a post-colonial society. We talk about post-colonial Africa and post-colonial India to describe countries that have thrown off their colonial masters.

But America is more than that. America dominates the world, yet, generally speaking, it does not have colonies. America has found a more advanced way to control other countries. Many people have spoken of "The American Empire," but unlike the British Empire it does not exist geographically. Most modern historians consider America's post-colonial empire to be financial in nature. While it is obviously defended with military strength, its normal method of control is economic domination. America's "colonies" are post-colonial in nature because they are occupied not by administrators and soldiers but by businessmen and products.