Why did European states continue to extend their domination over world trade and...
2 Answers
There are two ways to think about this.
First, the European states continued to dominate because they could. European states (and their offshoots like the United States) were technologically much more advanced than any other states. They were technologically able to simply overwhelm countries like China and India. Later, they were able to dominate African countries as well.
Second, the European states continued to dominate because they wanted and needed to. These countries were rapidly industrializing. They felt that they needed empires to provide them with markets and with raw materials. They also felt that they needed to have empires in order to keep up with the other European states.
Thus, the European states continued to dominate in these ways both because they could and because they felt the need to do so.
Posted by pohnpei397 on March 14, 2013 at 12:09 AM (Answer #1)
A few key points: first, there was a great deal of competition among the European powers during the Age of Discovery, but ultimately Great Britain became "Great" because it edged out all of its rivals.
The key to Britain's power and influence was the superiority of the Royal Navy. It kept economic traffic flowing uninterrupted, so revenue came in at an increasing rate from the New World and other key markets. This was a source of immense power, because Britain was geographically small and lacked the rich resources and connections that many other leading European powers enjoyed.
The 18th century, however, was exhausting for Great Britain. It fought several wars and disputes, such as the American Revolution, and entered the 19th century facing its greatest threat: Napoleon. This was a test of survival, and Britain managed to come out ahead. Much of this was due to the resolve of the British people, but also to the revenue secured by the Royal Navy and economic imports.
This seminal conflict ingrained the ideas of colonization and imperialism, and a British presence in every viable economic sector in the world, which Britain pursued successfully. It was not until the end of WWII that the British Empire "officially" came to an end. The British did perfect the practice of empire, colonization, and imperialism, but not always with the devious ulterior motives popularly attributed to them. In fact, in the 19th century, they believed it was their God-given mission and duty to transmit the best of British culture, language, and religion (Christianity) to the far reaches of the world. This was evident in Africa and Southeast Asia.
Posted by chrisberg on March 14, 2013 at 8:01 PM (Answer #2)