Vincent's article is about the development of a new system that uses facial recognition to predict sexual orientation. Vincent makes a number of points about this system:
- Claims that the system is accurate "81 percent" of the time are misleading. Vincent points out that the algorithm was tested on photos posted to an online dating website, which likely made the task easier, since those photos were chosen by their owners to appeal to people of a particular sexual orientation.
- Vincent suggests that such technology is a digital twist on racial stereotyping, referencing the pseudo-science of physiognomy and the discredited practice of measuring head size as a way of predicting intelligence.
- Such systems pose significant risks to privacy. In countries where homosexuality is illegal, they could be used to automatically screen people for sexual orientation, leading to systematic persecution and, in some cases, imprisonment.
Vincent concludes that, while the accuracy of such systems is in doubt, what matters more is the faith people place in the technology. "If people believe AI can be used to determine sexual preference, they will use it," he writes. What is needed, he suggests, is a more critical stance toward AI as a whole and an understanding of its limitations.