What did the colonies win after the Revolutionary War?

pohnpei397 | College Teacher | (Level 3) Distinguished Educator

After the Revolutionary War, the colonies (now the United States) won two main things: independence and a defined territory that would become their country.

The most important thing that the United States won after the war was the right to exist as a country. Before the war, the states were, of course, colonies ruled by Great Britain. They did not have the right to control their own destiny or to govern themselves. After the war, they gained these rights. They were now an independent nation, which was the main point of the war.

The other important thing that the new country won was a territory to call its own. There can be no such thing as a country without territory. In the Treaty of Paris (1783), which ended the war, the United States was granted the land (with the exception of Florida) that is now the part of the country east of the Mississippi River. Although the British did not actually vacate all of this land right away, it officially belonged to the United States.

After the war, then, the United States won the right to exist as an independent country with a territory of its own.
