Today's view of labor unions in the United States is generally fairly negative. This has come about in part for economic reasons and in part for political reasons.
As globalization has forced American companies to face tough competition, labor unions have lost ground. Far fewer private sector workers are unionized than was once the case. The decline in unionization is also partly due to the rise of white collar jobs, which are not typically unionized. With fewer Americans belonging to unions, there is less support for them.
Politically, the unions have become identified with the Democratic Party. This has meant that the Republicans have an incentive to attack labor unions. As the country has become more conservative, attitudes towards unions have become more negative as well.
For these reasons, American attitudes towards unions have become more negative in recent years.
I would also add that the media plays a role in how Americans view unions. Unions and their members are often portrayed as out-of-touch, greedy, and sometimes lazy. Given that some unions include large numbers of public employees, such as firefighters, police officers, and teachers, they can be an "easy" target. The economic situation facing our country has certainly added to this view: governments at the federal, state, and local levels can no longer afford to provide as many benefits to union members. However, what's missing from the public discourse is that unions and their members receive their pay and benefits as the result of a negotiated contract. The "management" that agreed to give the union those benefits seems to be forgotten, as does the fact that, in many cases, the rank-and-file members had nothing to do with the negotiation itself but are simply the recipients of its terms.