From the beginning of the colonial experience, and certainly from the creation of the United States, there was a struggle to create a sense of nationalism. Use plenty of details and examples to trace the evolution of this nationalism.
If your scope is simply pre-1877, then you also need to discuss the brand of expansionist democratic nationalism (i.e., Manifest Destiny) that culminated in the Mexican War, perhaps diplomatic developments like the Monroe Doctrine, and, most importantly, the establishment of the primacy of national power, a major result of the Civil War and Reconstruction. It's a huge topic, and one that you're probably better off tracing not in a linear fashion but in terms of themes, most of which have already been mentioned in this thread.
For a long time, nationalism in the United States depended heavily on religion. Religious freedom was one foundation of the Colonies, but settlers were almost uniformly Christian, even if divided among different denominations. While the denominations might squabble over points of doctrine, every upstanding citizen was some sort of Christian, and that shared faith gave people a sense of unity.
But your question also refers to the time after the creation of the US. You need to look at events such as the War of 1812 and the Louisiana Purchase, both of which helped raise pride in America. You need to look at the Civil War, which strained that sense of nationalism nearly to the breaking point. And you need to look at the 20th century, when various forces moved the country once again toward a feeling of national unity.
Before the Revolution, the colonies were all separate entities; some were even founded by different countries. What sense of shared identity existed came from being part of England and from living in the rugged, isolated American landscape. When the colonists began to feel that they had been treated badly by England, that shared grievance brought them together.
I should talk about the period before 1877.
Thank you so much, that was helpful.