What were some consequences of the War of 1812?

2 Answers


larrygates | College Teacher | (Level 1) Educator Emeritus


The Treaty of Ghent, which ended the War of 1812, made no mention of impressment, the practice over which the war had largely been fought. In essence, the treaty re-established the status quo ante bellum (the way things were before the war), as was often the case in European wars.

Although the United States accomplished very little militarily (in fact, many historians consider the war a loss for the U.S.), the conflict did have other important consequences:

  • Andrew Jackson's victory at the Battle of New Orleans made him a war hero and exceptionally popular. It was this victory that made him a household name and eventually propelled him to the Presidency. During his presidential campaign, his slogan contrasting himself with the more academic John Q. Adams was "Adams can write; but Jackson can fight."
  • The war marked the demise of the Federalist Party, creating a one-party nation. Many Federalists had opposed the war and participated in the famous Hartford Convention, which demanded several amendments to the Constitution as a condition of remaining in the Union. After the war, many Americans considered this unpatriotic, and the Federalist Party ceased to exist.
  • The intense nationalism that grew out of the war led to the "Era of Good Feelings," in which only one political party, the Democratic-Republicans, existed, and to the virtually unanimous re-election of James Monroe as President. Monroe received all but three electoral votes: one elector voted for John Q. Adams, reportedly believing the honor of unanimous election should belong to George Washington alone, and two others abstained, ostensibly for the same reason.
pohnpei397 | College Teacher | (Level 3) Distinguished Educator

The War of 1812 had serious impacts on the United States. Let us look at two of the most important.

First, there is the fact that this war helped to make the United States much more truly independent from Britain.  After the war, the British stopped doing things like impressing American sailors, thus treating the US with the respect due a sovereign nation.

Second, the war helped to make Americans feel nationalistic.  It helped them to feel more pride in their country because they had (at least in their minds) fought one of the most powerful countries in the world to a draw.  This was a major source of pride for many Americans. 
