European Colonization of North America


Was the United States ever part of the British Empire?

Expert Answers


The American colonies were a vital part of the British Empire, supplying it with valuable goods, markets, and manpower. After the Revolutionary War, however, the United States ceased to be a member of the empire in any meaningful sense; unlike other former colonies, which came to Britain's aid in later wars, the United States owed Britain no such allegiance.

Commercially, the United States was often treated as a favored former member of the British Empire, and the two countries traded freely after the American Revolution with few exceptions. In the buildup to the War of 1812, Britain impressed sailors from American ships, claiming they were British deserters; this practice helped push the United States to reassert its independence in the War of 1812. The two nations jointly occupied the Oregon Territory until the Oregon Treaty of 1846, negotiated during the Polk administration, divided it along the forty-ninth parallel. The British royal family visited the United States during the reign of Queen Victoria to much American fanfare. Although the United States belonged to the British Empire only during its colonial days, many Americans still feel close ties to the mother country.
