When one considers the role of race and racism in Southern life and the settlement and conquest of the U.S. west, was the pursuit of empire in 1898 really such a departure from U.S. political traditions – or did leaving American shores mark a real change?

Expert Answers

This is a great question. Racism, expressed through the genocide of the indigenous peoples of what is now the United States and the enslavement of African peoples, was absolutely foundational to the creation of America. The United States' economy was built on slavery, and its domestic expansion was carried out through the dispossession and genocide of Native nations. So the country's shift from isolationism to interventionism and imperialism abroad is not a surprising development. One must remember that the United States itself was created through the imperialism of European settlers against indigenous peoples. When the United States military gained control of Guam, the Philippines, and Puerto Rico in 1898, the same racist policies that had been exercised domestically were simply translated onto a global stage of racism and imperialism.
