Perhaps you could ask this question in a more detailed way so that you can get an answer that is truly helpful to you.
Racism can generally be defined as the belief that one's own race is superior to one or more other races. It has long been a problem in the United States. One useful thing to know about racism in the United States is that it has changed greatly in the last 30 or 40 years. In times past, it was much more socially acceptable to express openly racist views. Today, this is no longer true; overt racism is taboo in almost every part of American society. This shift has led to considerable debate about whether racism remains a significant force in American life. Many people think it does not. Others, however, argue that racism still exists but that people who hold racist beliefs are simply much less open about expressing them.
However, there are many other aspects to racism that you might want us to address. Please let us know what, exactly, you would like discussed.