On balance, did 19th century imperialism aid or harm the societies involved? Why, and how? Did some groups gain or lose significantly more or less than others? Which groups, in each or either case? How and why?
On balance, I'll argue that it harmed the societies involved. It hurt those who were colonized: it took away their freedom and prevented them from developing in their own ways. It also hurt the imperial powers themselves. If nothing else, it damaged them morally by encouraging them to think of themselves as better than those they conquered, in ways very well set out by George Orwell in "Shooting an Elephant."
While I agree that imperialism had far more negative effects than positive ones and functioned as merely another form of enslavement, I would like to offer a few positive points here. Imperialism helped smaller colonies develop and modernize. The mother nations brought to the colonies irrigation systems, such as canals; transportation systems, such as roads and railways; educational opportunities, such as schools; communication systems, such as the telegraph and newspapers; and a structured economy.
There were some negative and positive effects, but I think the negative ones have far outlasted the positive in some places. In other places, the opposite is true. Look at Hawaii. It's doing fine. Some countries in Africa, like Somalia, are very much struggling.
I would agree with those who identified both positive and negative effects of 19th century imperialism. Some smaller societies were able to advance, while others were simply seized to allow the dominant nation to expand its territory.
I am not convinced that many of the benefits of "modernization" ascribed to imperialism were actually benefits. That privileges a Western point of view and a notion of progress that I'm not sure colonized people would have agreed with.
What gains there were usually went to an elite, one often identified and cultivated by Europeans, and this frequently had devastating consequences that are still felt today.
Overall, I view imperialism as a net negative for most colonial people involved. It was not simply a way to increase territory, but a way to extract wealth from peoples and resources around the world to fuel the growth of industrial capitalism.
If you want to get a very up-to-date discussion of this topic, take a look at Niall Ferguson's latest book, The West and the Rest. This is just the latest of several books by Ferguson in which he suggests that there were actually some benefits to imperialism, such as the spread of modern medicine. He does not deny the evils of imperialism (in fact, he stresses them quite strongly), but he does try to present a more nuanced picture than has sometimes been presented.
I particularly like the idea (expressed above) that imperialism was bad for the imperialists because it encouraged in them a kind of unfortunate arrogance. In the long run, the world would seem a better place if imperialism did not exist, although it seems to have existed in various forms almost since the beginning of time.
I think in retrospect you have to say that the countries affected by imperialism in the nineteenth century are worse off for it.
Africa and South America were under the yoke of Europe during the 19th century. These areas, as a whole, have remained economically dependent, and most of their countries still rank as third world nations, particularly in Africa. Politically, these same areas have been hotbeds of unrest and violence throughout the 20th century and up to today.
If we take the long view, I don't think we can argue that any of the "improvements" made during the imperial period have had positive lasting effects. If they had, why are these same areas still struggling the most today?
I think that if we look at the "native" societies involved, we can clearly see that overall, unfortunately, imperialism had an adverse impact on those nations. This is because imperialism represented an attack on traditional culture and values, replacing them with values and a culture that did not fit the context. We are still seeing the results of this imperialistic endeavour in the world now.
All right, but apart from the sanitation, medicine, education, wine, public order, irrigation, roads, the fresh water system and public health, what have the Romans ever done for us?
I guess the question regarding imperialism is whether the natives of a particular area actually wanted these benefits. If imposed by the imperial power, clearly that would violate their sovereignty; however, are there any cases where imperial powers were welcomed with open arms?