Some people say that U.S. imperialism did not end in the late 19th/early 20th century but that America is still imperialistic today. Do you agree or disagree? Why or why not?
I do believe that the United States is still imperialistic, although its imperialism today is more directly economic than it was in the late 19th and early 20th centuries.
One could make an argument that we have no business holding Puerto Rico or Guam as territories, unless, of course, they are strategically important for maintaining the empire.
You could also say that a prime motivation for both wars with Iraq was to protect an oil supply on which the American economy depends.
Fareed Zakaria, a noted author and host of a talk show on CNN, once said: "America loves democracy in countries that are strategically irrelevant."
I do not agree. America still seeks global influence and tries to encourage other countries to be like it. However, America no longer tries to take or control territory in the way that it did 100 or so years ago.
To see how this is the case, let us compare the way we handled Cuba after the Spanish-American War with the way we are handling Iraq and Afghanistan today. With Cuba, we passed the Platt Amendment, essentially giving ourselves the right to control Cuba's policies even though the country was nominally independent. By contrast, after fighting wars to overthrow the governments of Iraq and Afghanistan, we are allowing those countries to rule themselves. Yes, we try to influence them, but we certainly are not controlling the policies of either country.
So while we do still try to exert influence, we do not do so in any way that can truly be called imperialistic.