European Colonization of North America

Was the United States ever part of the British Empire?

Expert Answers


The American colonies were a vital part of the British Empire, and they were quite lucrative for Britain in terms of both people and goods. After the Revolutionary War, however, the United States was no longer a member of the British Empire in the way that other colonies were; while other former colonies jumped at the chance to come to Britain's aid in various wars, the United States did not.

Commercially, the United States was treated as a favored former member of the British Empire, and the two nations traded freely after the American Revolution with few exceptions. In the buildup to the War of 1812, Britain impressed sailors from American ships whom it claimed were deserters from the Royal Navy; this practice led the United States to reassert its independence in the War of 1812. The United States and Britain also jointly occupied the Oregon Territory until diplomats divided it during the Polk administration, with the portion south of the 49th parallel going to the United States. Later, the British royal family visited the United States during the reign of Queen Victoria to much American fanfare. While the United States was part of the British Empire only during its colonial days, many Americans still feel close ties to the mother country.
