During most of human history, slavery of some sort was an accepted part of society. Only after the Age of Exploration, as the British colonies of North America slowly developed a self-image as a place of freedom, did slavery begin to carry a truly negative stigma; it took hundreds of years and a Civil War for the United States to shake off that past and move toward a desegregated society. During the era of slavery, the effect on the predominantly white society was one of material benefit: plantations and factories thrived on cheap or unpaid labor, and whites considered themselves free to focus on matters other than menial work. Major thinkers, however, tended to be critical of slavery even while profiting from it themselves; the prevailing notion was that slavery, by dehumanizing any one group, dehumanizes all of humanity.
Over time, however, the principle that "all men are created equal" took root in the collective consciousness, even as white society resisted desegregation for generations, since the status quo is always easier to maintain. The final effect of U.S. slavery on society, white or otherwise, is that of a nation that committed a sin common and accepted in its time and grew to abandon the practice as inhumane. By showing the world a place where equality is genuinely practiced, rather than the "some more equal than others" philosophy of other nations, the U.S. demonstrated that freedom from slavery is both possible and far more productive. Yet the stigma of slavery remains in the collective consciousness, tainting much of politics and society with racial prejudices held on both sides.