The difference between colonialism and post-colonialism has often stymied us. When we hear people talk about post-colonial theory or post-colonial thought, we have been known to go: “What do you mean by ‘post-colonial’?”
So let's just keep it basic and specific. Colonialism refers to the practice in which one country formally took partial or complete control of another, brought in its own people, and exploited that country's citizens and resources for its own economic prosperity.
When we think of colonialism, we typically think of race. We think of a white, European country—like England, France, or Germany—taking over another country of black or brown people. For example, Algeria was once a French colony, and India was once a British colony.
After World War II, in 1949, U.S. President Harry Truman declared,
The old imperialism—exploitation for foreign profit—has no place in our plans. What we envisage is a program of development based on the concepts of democratic fair-dealing.
With leaders touting democracy, formal colonialism began to fade. Former colonies, like Algeria and India, fought their colonizers and won their independence.
Of course, this did not mean the end of the United States or European countries trying to dominate countries of mostly black and brown people. However, what began to emerge was white countries exerting their power in a different way.
That's where post-colonialism comes in handy. It's a way of noting that these countries no longer exert their control in the explicit context of colonies, yet they continue to wield power in ways that are often harmful, deadly, and exploitative.
Post-colonialism also provides a framework to examine the colonial legacy itself, interrogate its ideology, and trace how those ideas and policies continue to shape our world today.