This is a great question. There are so many things that could be said. So, I will make two important points about the impact of Vietnam.
First, the impact of the war was great from a social point of view. In fact, I would say that this was one of the watershed moments of modern American history, because the people of America, who at one point held the government in high esteem (especially after World War II), now saw the self-interest of the American government. All of this could be seen in the countless protests. Even in the collective mind of Americans today, the Vietnam War connotes only negative images. This was a huge blow to public trust in the state, and America has never completely recovered from it.
Second, what made things worse was how America conducted the war. For example, the military used chemical defoliants, such as Agent Orange, which still affect the local population through birth defects and other horrible ailments. In addition, it should be noted that American soldiers were also affected, and the manufacturers of the chemical had to settle out of court with these soldiers, who were experiencing physical problems after the war. Heroin addiction had also become a huge problem by the early 1970s, and many people saw this as a direct outgrowth of the war.
In short, the government lost legitimacy in the eyes of many people.