Weapons of Math Destruction Characters
Weapons of Math Destruction does not have characters in the traditional sense, but its key figures include Sarah Wysocki, Robert McDaniel, and Kyle Behm. Each of these people is an example of how Weapons of Math Destruction (or WMDs) can affect lives through bias and insufficient information.
- Sarah Wysocki, a teacher, was fired because an algorithm gave her teaching ability a low rating.
- Robert McDaniel was singled out by a Chicago Police Department algorithm that attempted to predict future crime.
- Kyle Behm was rejected from jobs as a result of personality tests that made assumptions about his potential performance.
Last Updated on January 13, 2020, by eNotes Editorial. Word Count: 504
At the outset of her book "Weapons of Math Destruction," Cathy O'Neil identifies three characteristics of the kinds of algorithms she considers weapons. They are opaque or invisible, damaging to their subjects, and can be scaled-up exponentially. The work is nonfiction and not biographical and as such does not contain characters in a conventional sense. However, O'Neil describes the plight of several real-life victims of these algorithms.
The author takes up the case of Sarah Wysocki, a teacher who was fired because an algorithm gave her teaching ability a low rating. There was evidence that the evaluation generated by the algorithm was based on falsified student scores. She complained to her district administrator, Jason Kamras, who admitted that the test scores seemed suspicious but nonetheless refused to reinstate Wysocki.
The author claims that feedback loops arise from the use of algorithms. In Wysocki's case, the algorithm in question produced easily understood results: a couple of hundred underperforming teachers who, it suggested, should be fired. The algorithm appears to work because it delivers clear results, while its complexity makes it difficult to criticize. Wysocki failed to convince her superiors of her case, but thanks to her teaching abilities and good reputation she quickly secured another job outside the district.
In 2009, the Chicago Police Department received a two-million-dollar grant from the National Institute of Justice to develop a predictive program for crime. However, the program they chose went beyond identifying possible crime-ridden areas. It singled out four hundred individuals deemed potential lawbreakers. Among these was twenty-two-year-old Robert McDaniel. McDaniel had not committed any crimes. However, he grew up in a dangerous neighborhood and many of his acquaintances had had brushes with the law. It was sheer bad luck that led to his being singled out by the algorithm. His chances of falling afoul of the police in the future were greatly increased because of a faulty model. O'Neil disagrees with the whole approach underlying the algorithm and argues that instead of singling out individuals, government organizations should emphasize supporting crime-ridden areas.
Kyle Behm was a young man who had been diagnosed with bipolar disorder and, after receiving treatment, had returned to his studies at Vanderbilt University. Subsequently, he applied for a series of low-skill jobs but was turned down for all of them. With the aid of his father, an attorney, he discovered that this was due to the results of a personality test given to him by his prospective employers. The test was part of an employee selection program designed by Kronos, a workforce management company based in Boston. Kyle was devastated by the experience and the test's outcome, which made him seriously question his abilities. As the author points out, it is illegal to subject jobseekers to medical testing. At the time the book was published, the Behm family was still awaiting a court decision on his case. Kyle Behm is a disturbing example of how algorithms can damage those subjected to them.