I'd suggest that European imperialism played a significant role in shaping American politics. Consider the Monroe Doctrine, as well as the Roosevelt Corollary to it: one of the fundamental cornerstones of American foreign policy was formulated as a response to European influence in the Americas. Of course, it must be recognized that during the early decades of the 1800s the United States lacked the military strength to actually enforce this claim, but the doctrine stood as the United States' assertion of its own sphere of influence in the Americas.
In addition, we should consider America's own imperialist adventures, beginning in the late 1800s. A critical moment in this history was the Spanish-American War, through which the United States acquired overseas possessions from Spain. Consider also the Open Door Policy, through which the United States sought access to China, which was then divided into European spheres of influence. We can also mention the numerous US interventions in Latin America, as well as the construction of the Panama Canal, which was itself secured only by US military backing of Panama's revolt against Colombia. Be aware that the United States has its own very real history of imperialism to contend with.