I am assuming you mean the debate within the United States at that time where imperialism was concerned. This was a central theme of the late 1800s as America went to war with Spain and added colonies in both hemispheres for the first time.
Those in favor of imperialism argued that America needed to compete on the world stage with the European empires, and that it would be better for America to take those colonies than, say, Britain or Germany. They argued that the country needed the resources and that white America was racially and religiously superior.
Anti-imperialists argued that acquiring colonies benefited only the wealthy robber barons, not the average American. They argued that the country was founded on anti-imperialist sentiment, so such a policy was undemocratic and a betrayal of its roots. They believed American blood should not be shed to benefit a relative few.