It can certainly be argued that the turn to imperialism by the United States in the late nineteenth century was of a piece with the expansionist policy that dated back to the founding of the nation. Many of the justifications were similar, including a civilizing mission, expanding economic opportunities, and competition with foreign powers. Additionally, the anxieties that resulted from the closing of the frontier (most famously expressed in Frederick Jackson Turner's 1893 essay "The Significance of the Frontier in American History") are often cited as driving forces for continued expansion as a way of maintaining American vitality in a modern world.
Yet imperialism marked a major departure from previous US policy. For more than a century, political leaders had emphasized American isolation from the problems of the wider world. By mid-century, however, the desire for new markets for American manufactured goods had resulted in increasing involvement in foreign affairs. This trend began most conspicuously in 1854, when Commodore Matthew Perry concluded the Convention of Kanagawa, which forcibly opened Japanese ports to trade.
Over the next fifty years, the United States, driven by fears of industrial overproduction, moved increasingly toward an economic policy that emphasized the acquisition of secure markets and a military policy that sought naval bases around the world. This approach was, in many ways, taken to its extreme conclusion with the Spanish-American War and the acquisition of the Philippines and Puerto Rico, as well as frequent interventions on behalf of US investments in Central America before World War I.