Diversity in the workplace has become important for two main reasons. First, there is more diversity in the population and in the workforce. Second, attitudes have changed and any company that is conspicuously intolerant of diversity is likely to suffer harm to its image and, thereby, to its profits.
In today’s United States, the workforce is much more diverse than it once was. There are more members of racial minorities in the country today than there were decades ago. In addition, many members of these minority groups have attained higher levels of education and are therefore occupying white collar jobs (which are themselves more plentiful than in the past) at much higher rates. Racial minorities are not the only newcomers to the workforce. Women are also far more present in the workforce, and in “good” jobs, than they once were. All of these factors have made the workplace a much more diverse place.
Just as importantly, Americans have come to value diversity more than they once did. There was a time when minorities or women in the workplace simply had to put up with being treated unequally because of their status. Today, such behavior is no longer acceptable to most people. A business that clearly discriminates against women and/or minorities will lose standing in the eyes of the general public. It will likely lose customers, and it may well attract fewer and less qualified workers.
For these reasons, workplace diversity is much more important than it once was to management in American businesses.