Is Hawaii a colony of the U.S.?

3 Answers

bullgatortail | High School Teacher | (Level 1) Distinguished Educator

Because the founding fathers disdained their own position as colonies of Great Britain, they chose the term "states" when they declared independence from England. For this reason, the United States has never called any of its possessions "colonies"; instead, non-states are referred to as territories. Many of the Western states began as territories, and the U.S. still maintains several such possessions today. Puerto Rico, for example, is a territory whose people have repeatedly voted to maintain their present status; it would be a natural candidate for a 51st state sometime in the future. The U.S. Virgin Islands, Guam, and American Samoa are other American territories today.

pohnpei397 | College Teacher | (Level 3) Distinguished Educator

No, Hawaii is not a colony of the United States. In fact, those of us who are from Hawaii tend to get pretty annoyed when people talk about leaving Hawaii to go to "the United States." We call the rest of the US "the Mainland."

Anyway, Hawaii is actually one of the states of the United States. It has been a state since 1959, when it became the 50th (and last) state of the Union.

Before that, Hawaii was a territory (so I guess you could call it a colony) of the US. The US got it in the 1890s by overthrowing the last queen of Hawaii, Queen Liliuokalani.

william1941 | College Teacher | (Level 3) Valedictorian

Hawaii was a destination for American and European businesses because its soil and climate were ideal for growing fruit and sugarcane. These businesses established large plantations there.

Hawaii had been an independent monarchy, but to protect and expand their business interests, forces backed by US businesses and the US military carried out a coup. The monarchy was overthrown, and political control over the islands passed into the hands of powerful American business interests.

In 1900 Hawaii became an official territory of the US, and in 1959 it was made the 50th state.