I would argue that it advanced the American national interest significantly. It gave the US a great deal of territory that turned out to be very valuable. Just think about what California has meant to the US: besides the Gold Rush, it has been and remains central to US agriculture and to many other industries. Gaining California was very much in the national interest, and the US gained much else besides.
Taking this territory did help to bring on the Civil War, but that war was likely to happen regardless. In addition, the war ended slavery, which is surely a good thing as well.
The US entered the Mexican War for what we would see as immoral reasons, but it is hard to argue that the war was not in the interests (in the long term) of the country.
The Mexican War is often attributed to America's expansionist agenda. There were vast sparsely settled lands in North America in areas like Georgia and Louisiana, but Texas and California still appeared more lucrative. The Mexicans conditionally accepted settlers from the North, and as their numbers swelled, the economic potential of the region became increasingly apparent. Americans sought to take away the privileges enjoyed by the locals and instead impose a centralized authority that would be beneficial to themselves.
The Americans made major gains after the war, as mentioned, especially with regard to agriculture, mining, industry, and commerce. The Mexicans were deliberately provoked into a war for which they were ill prepared, and their attempts to protect their territory failed. This led to the annexation of Texas and to American settlement in California and New Mexico at minimal cost to the Americans.