Physiognomy in the Age of AI
In this chapter, we'll explore two instances in which machine learning systems were created to categorise people based on a facial image. The first purports to determine whether the subject is a criminal or not (Wu and Zhang 2016), and the second whether the subject is lesbian, gay, or straight (Wang and Kosinski 2018). Such findings are in the tradition of physiognomy (see Figure 13.1), the pseudoscientific belief that a person's appearance reveals their essential nature, and with it their value to society. Our reflections on the pseudoscientific use of AI to sort and classify people according to their external appearance demonstrate the continued importance of feminist studies of science, a field which shows how patriarchal power often operates under the guise of scientific 'objectivity' (see D'Ignazio and Klein, Chapter 12 in this volume). Like Michele Elam (Chapter 14 in this volume), we point out the misuse of AI systems to classify people erroneously and arbitrarily, highlighting the harmful effects these practices are likely to have on marginalised communities. In the first part of this chapter, we review this legacy, showing how the medical, legal, and scientific patriarchy of the nineteenth and twentieth centuries used (and at times developed) state-of-the-art techniques to both rationalise and enforce a hierarchy with prosperous straight white men at the top. This rationalisation relied on correlating physical measurements of the body with 'criminality' and sexual orientation.