Weapons of Math Destruction Themes
The main themes in Weapons of Math Destruction are risk to marginalized communities, corruption, and human bias.
- Risk to marginalized communities: Decisions made by Weapons of Math Destruction (or WMDs) often amplify and perpetuate discriminatory practices based on wealth, race, gender, and other identity categories.
- Corruption: The use of “Big Data” that is unavailable to the people it affects encourages corrupt practices and the unethical use of people’s information.
- Human bias: In relying on humans to create and alter WMDs, we often preserve human bias within the algorithms themselves.
Themes
Last Updated on April 2, 2020, by eNotes Editorial.
O'Neil develops her argument against Big Data around three central claims: it preys on the disadvantaged, operates without public knowledge, and entrenches old biases in ways that are difficult to question.
Risk to Marginalized Communities
O'Neil argues that the data industry places marginalized communities at a particular disadvantage, compounding already difficult circumstances. She explains how credit scores are increasingly used in hiring decisions, meaning that applicants with lower scores are automatically disqualified. Furthermore, those who live in impoverished neighborhoods are often more heavily policed because crime prediction models direct law enforcement there. O'Neil concludes that this allows the wealthy to maintain control while diminishing the possibility of upward mobility for everyone else.
Corruption
O'Neil explores how the opaque processes by which data analysis operates inevitably lead to corruption. E-scores, for example, quantify and classify individuals based on undisclosed factors. The growing use of artificial intelligence to assemble individual dossiers allows the collection of personal data to remain largely invisible to the public. O'Neil contrasts these shady methods with the transparency of FICO credit scores and baseball statistics to demonstrate how effective models can work under proper oversight. Without government or public accountability, she suggests, Big Data will continue to use our personal information to control and manipulate us.
Human Bias
O'Neil consistently explains how models are inherently biased because their objectives are set by humans. Throughout the book, she uses examples to show how numbers are now used to justify discriminatory practices that are difficult for those affected to challenge: many people are ill-informed about how these models work, or their objections are dismissed with the claim that math is perfectly objective. For example, many models rely on variables such as zip code that act as proxies for race when predicting everything from criminal recidivism to loan default, allowing institutions to continue racially prejudicial practices while citing data as the reason.
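The proxy problem described above can be illustrated with a toy scoring model. This is a hypothetical sketch, not an example from the book: the synthetic data, the zip codes, and the scoring rule are all invented for illustration. The point is that even a "race-blind" model that never sees a group attribute reproduces a group disparity when it relies on a correlated proxy such as zip code.

```python
# Toy illustration (hypothetical, not from O'Neil's book): a "blind"
# score that uses only zip code still penalizes a group that is
# geographically concentrated, because zip code acts as a proxy.

# Synthetic applicants: (group, zip_code, repaid_loan)
applicants = [
    ("A", "10001", True), ("A", "10001", True), ("A", "10001", False),
    ("B", "60601", True), ("B", "60601", False), ("B", "60601", False),
]

# Historical repayment rate per zip code -- the proxy variable.
by_zip = {}
for _, zip_code, repaid in applicants:
    by_zip.setdefault(zip_code, []).append(repaid)
zip_rate = {z: sum(v) / len(v) for z, v in by_zip.items()}

def score(zip_code):
    """Score an applicant using only zip code -- no group attribute."""
    return zip_rate[zip_code]

# Average score per group: the disparity reappears through the proxy.
for group, zip_code in [("A", "10001"), ("B", "60601")]:
    print(group, round(score(zip_code), 2))
```

Because group membership and zip code are perfectly correlated in this toy data, excluding the group attribute changes nothing: group B is scored lower across the board, which mirrors O'Neil's argument that "objective" inputs can encode the very bias they claim to avoid.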
For these reasons and more, O'Neil asserts that it is in the public's own best interest to demand transparency and regulation for Big Data so that models can instead be used for the betterment of society.