The Spanish-American War greatly expanded the US "empire," but it did not make the US an imperial country. That had already happened.
To some extent, you can argue that the US was trying to be imperialistic as far back as the Monroe Doctrine of 1823, which essentially claimed all of the Americas as an American sphere of influence. You can also argue that the Mexican-American War was an exercise in imperialism. Beyond that, the US took steps toward overseas imperialism before the Spanish-American War. The main example was Hawaii: in 1887 the US secured exclusive rights to Pearl Harbor, and it exercised effective (though unofficial) control over the islands well before formally annexing them in 1898.
I would say, then, that this war expanded American imperialism but was not its first instance.