When one considers the role of race and racism in Southern life and in the settlement and conquest of the U.S. West, was the pursuit of empire in 1898 really such a departure from U.S. political traditions, or did leaving American shores mark a real change?

Expert Answers

This is a great question. Racism, through the genocide of the indigenous peoples of what is now the United States and the enslavement of African peoples, was foundational to the creation of America. The United States' economy was built on slavery, and its domestic expansion was carried out through genocide. So the country's shift from isolationism to interventionism and imperialism abroad is not a surprising development. One must remember that it was the initial imperialism of European settlers against indigenous peoples that led to the creation of the United States in the first place. When the United States military gained control of Guam, the Philippines, and Puerto Rico in 1898, the same racist policies that had been exercised domestically were projected abroad as global racism and imperialism.

Approved by eNotes Editorial Team