The War of 1812 had significant consequences for the United States. Two of the most important are worth examining.
First, the war helped make the United States more genuinely independent of Britain. After the war, the British stopped practices such as impressing American sailors and began treating the US with the respect due a sovereign nation.
Second, the war fostered a new sense of American nationalism. Americans took greater pride in their country because they had, at least in their own minds, fought one of the most powerful nations in the world to a draw.