Has the US ever been imperialistic?



Answer from pohnpei397:

The United States has certainly been imperialistic at times in its past.  The most clear-cut example of such a time was the late 1800s and early 1900s.  

During this time, the United States took control over all or part of many other countries.  It took control of what was then the Kingdom of Hawaii.  It used the Spanish-American War to take control of Cuba and the Philippines.  It used its military power to take control of countries such as Haiti, the Dominican Republic, and Nicaragua.  It did not formally annex these countries, but it did control them for periods of time.  During this era, the United States was clearly acting as an imperial power.
