Absolutely, the West still influences us today. The idea of the Wild West has helped to create the culture and attitudes that we still hold.
We see this most clearly in the idea that Americans are rugged individualists. We imagine that the pioneers "tamed" the West on their own, without help from the government and without rules that infringed on their freedoms. Because of this, we believe that this is how we still can and should live today. The myth of the West encourages us to think that anyone can make it on their own if only they try hard enough, that we should not need regulations to keep us safe, and that we do not need help from anyone: if we are just tough enough and work hard enough, we will succeed.
So the real historical legacy of the American West lies in our attitudes about who we are and what we believe we should be like.