I would say that Progressivism did make an impact on America because it changed how capitalism was viewed. Much of Progressivism dealt with the idea that capitalism, as an economic order, had to be reformed so that the interests of all American citizens could be heard and met. Progressivism succeeded in making the concerns of working-class individuals, the problem of industrialist greed, and the idea of exposing "how the other half lives" a permanent part of the American political, social, and economic lexicon. By demanding that these voices be heard, Progressivism ensured that American thought would always incorporate the perspectives of those who might not otherwise be readily acknowledged. In this light, I would say that while the movement sought to do much more, it accomplished a great deal with what it did.
There is no doubt that Progressivism had an impact on American domestic policies during the 20th century, and that impact continues into the 21st century. Whether the Progressive agenda has had a positive or negative effect on society depends upon one's perspective, which in many cases is shaped by one's politics. To be fair, I offer a general response from both sides of the fence. It is up to you to decide which side you are on and argue your position.
Good luck!
Positive impact: Progressivism, through government legislative policies, has increased the quality of life for many Americans. Some of these policies include child labor laws, building codes (governing living conditions and fire escapes), mandatory education, Social Security, financial institution regulation, the minimum wage, Medicare, Medicaid, and Project Head Start.
Negative impact: Progressivism, through government legislative policies, has increased the size of the federal government and raised taxes. The growth of government has led to bureaucratic inefficiency and public abuse. In some cases, social welfare policies have fostered a sense of entitlement that is counterproductive to both the individual and the greater society.
Either way, Progressivism has left its mark on American society.
I would argue that the Progressive Era did make a lasting impression on the United States. As for whether that impression was positive, that is for you to decide, since people can reasonably disagree on the question.
First, the Progressive Era changed the way government works. Here in the West, we have the right to pass laws through initiatives and referendums. We typically have nonpartisan elections at the local level. Both of these practices come from the Progressive Era. In many places, we also have our electricity provided by the government.
Second, the Progressive Era changed expectations to some extent. It pushed the idea that the government should be actively involved in improving people's lives and morals. From that idea, we get such things as the continuing war on drugs. We also get the notion that the government should do things like reducing smoking and encouraging people to eat more healthily.