Viewpoints

Use of Emotion Recognition Tools in Psychiatry Said to Be Premature

Published Online: https://doi.org/10.1176/appi.pn.2022.2.36

Abstract

The extended COVID-19 pandemic has brought about untold changes across nearly every facet of life in the United States. As society looks to transition from the crisis phase toward a so-called “new normal,” many industries are making greater use of tools that enable productivity at a distance. In many cases, artificial intelligence (AI) has provided the foundation upon which these tools are built. Emotion recognition tools (ERTs) are one example. These AI-powered technologies claim to evaluate facial micro-expressions captured by a camera and to report an emotional state inferred from them. Is your co-worker happy? Is this second grader angry? The potential benefits of these technologies are obvious, but their many pitfalls may not be.

As this new field explodes onto the scene, poised to reach a value of $20 billion to $60 billion by mid-decade, the stakes for an industry already, and rightfully, under ethical scrutiny have been heightened. The central questions for these technologies concern how accurately software can identify facial expressions and whether facial expressions are a reliable indicator of underlying emotion. The technology companies confidently assert that these questions have been answered and have introduced their products into diverse and ethically complex settings, including private and national security, criminal justice, and health care. Already, software programs such as Woebot have introduced mental health–specific ERT applications to monitor depression, anxiety, and substance use. Woebot received FDA Breakthrough Device designation to monitor women for postpartum depression. While Woebot was designed and validated by Stanford-trained computer scientists and psychologists, many other programs have not been developed with such scientific rigor.

Built upon models developed by Paul Ekman, Ph.D., to categorize facial movements into emotional taxonomies, current ERT technologies largely tout the ability to recognize “the six universal human emotions”: anger, disgust, fear, happiness, sadness, and surprise. Software programs such as Enablex and Emotient also claim to detect attention and arousal. Psychiatrists use facial expressions as one element of a complex, multifactorial formulation of a patient, whereas these software programs rely primarily on graphical information captured by a webcam. For the same reasons Ekman’s ideas have come under increasing scrutiny from contemporary psychologists and psychiatrists, technologies built upon his models often fail to account for cultural differences and contextual factors.

A landmark 2019 review in the journal Psychological Science in the Public Interest by Lisa Feldman Barrett, Ph.D., and colleagues found only weak correlations between facial expressions and emotions. In short, while people do smile when happy, there are substantial differences across cultures, situations, and even individuals. Many female-presenting individuals, for example, are socialized to smile politely, which may convey neither happiness nor agreement. Because these technologies are largely being developed by a small, homogeneous group of programmers, particular caution is needed.

Anticipating the dangers of this understudied technology, legislators in the European Union have already drafted an omnibus proposal to regulate AI in this and similar applications. The Brookings Institution, an American think tank, has called for a total ban on the use of ERTs in law enforcement, citing civil liberty concerns.

Just as medications and therapies are subject to intensive study prior to use in humans, psychiatrists should lobby for greater study of emotion analysis technologies before they are deployed further. Child psychiatrists, in particular, should object to the use of these understudied technologies to analyze minors in ways that might have lasting negative consequences. Until these software programs are validated, we should approach their analyses with caution and skepticism. ■

“Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements” is posted here.


Jacob Lee, M.D., is a PGY-4 child/adolescent psychiatry fellow at the University of Hawaii and a member of the APA Committee on Climate Change and Mental Health.