This is a great question. Racism, through the genocide of the indigenous peoples of what is now the United States and the enslavement of African peoples, was absolutely foundational to the creation of America. The foundation of the United States' economy was built on slavery, and US domestic expansion was achieved through genocide. So America's shift from isolationism to interventionism and imperialism abroad is not a surprising development; one must remember that it was the initial imperialism of European settlers against indigenous peoples that led to the creation of the United States. When the US military gained control of Guam, the Philippines, and Puerto Rico, the same racist policies that had been exercised domestically were translated into global racism and imperialism.