The lure of economic gain and expanding spheres of influence around the world helped the United States become an imperial power. Many historians argue that the United States engaged in imperialism largely in pursuit of greater economic gain. Industrialization had generated enormous income within the United States, and there was a natural inclination to extend those wealth-generating enterprises beyond its borders into other countries. Business interests abroad then had to be protected, and using the military and expanding government policy to incorporate these holdings as part of a political initiative furthered imperialism. This idea, known as dollar diplomacy, drew the United States into imperialism and its practices.
This is open to historical interpretation, of course, but the short answer is financial gain, emerging nationalism, and racism.
By the late 1800s the United States had expanded all the way across the continent, and much of its resources had been claimed or developed. The robber barons of the era pressured the government to let them expand overseas into colonies, and to help secure and then protect those lands.
The United States was also quite proud of itself in that era, believing that America could look the European powers in the eye on the world stage and more than hold its own. Follow the link on the Open Door Note for a good example. The colonies we took were quite strategic, serving as excellent forward bases for our troops and navy.
American society at the time was also quite racist. There were laws on the books against immigration by Asians. Social tensions with the immigrant groups that had already arrived were common, and many believed that the white race, the English language, and the Christian religion were superior, and that we should bring those things to other parts of the world, whether they were wanted or not.