Popularity of Mental Health Chatbots Grows

Abstract

Consumers have shown an interest in the empathic dialogue available 24/7 through an AI-powered mental health chatbot. But high-quality research into the effectiveness of these tools is lacking.

Over the past few years, people experiencing symptoms of mental distress have started to turn to a new type of service for support: mental health chatbots. With a few taps on a cell phone, chatbots can offer a channel through which one can receive empathic, nonjudgmental text communication at a moment’s notice, 24 hours a day.

Chatbots are programs that use language-processing software and artificial intelligence (AI) to engage in dialogue with humans. Well-known chatbots include Apple’s Siri and Amazon’s Alexa, which respond to commands (“Alexa, set timer for five minutes.”) and answer one-off questions (“Siri, what’s the weather forecast for today?”). But many companies believe chatbots can fill an important health care gap by providing an outlet for people who want to talk about their mental health outside of a regular clinical visit.
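
To make the idea concrete, below is a minimal, hypothetical Python sketch of a chatbot’s dialogue loop, with simple keyword matching standing in for the natural-language processing that real products use; it illustrates the general pattern only and is not how Siri, Alexa, or Woebot actually work.

```python
# Minimal illustrative chatbot loop. Keyword matching stands in for the
# language-processing software real chatbots use; all intents and replies
# here are hypothetical examples.

RESPONSES = {
    "timer": "Okay, starting a timer.",                             # command-style request
    "weather": "Today looks sunny and mild.",                       # one-off question
    "anxious": "That sounds hard. Want to tell me more about it?",  # supportive dialogue
}

def reply(message: str) -> str:
    """Return the first canned reply whose keyword appears in the message."""
    text = message.lower()
    for keyword, response in RESPONSES.items():
        if keyword in text:
            return response
    return "I'm not sure I understood. Could you rephrase that?"

if __name__ == "__main__":
    print(reply("Set a timer for five minutes"))  # command-style request
    print(reply("I'm feeling anxious tonight"))   # supportive reply
```

The point of the sketch is the loop itself: each user message is mapped to an intent and a response, the same basic structure that more sophisticated AI models implement at far greater scale.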

Consumers certainly seem interested in mental health chatbots. A 2021 national survey commissioned by Woebot Health, one of the leading therapeutic chatbot companies, found that 22% of adults had used a mental health chatbot, and 47% said they would be interested in using one if needed. Among respondents who had tried a mental health chatbot, nearly 60% said they began using it during the COVID-19 pandemic, and 44% said they use chatbots exclusively and do not see a human therapist. The most common reasons respondents cited for using, or being interested in using, a chatbot were that the tools are inexpensive, easy to use, and accessible anytime.

“It’s rare in psychiatry that we see really novel advances in clinical care, but chatbots could be transformative because they can offer constant access to care,” said John Torous, M.D., director of the Division of Digital Psychiatry at Harvard-affiliated Beth Israel Deaconess Medical Center. Torous said that he has been monitoring the progress of chatbots over the past several years.

He noted that the acceptance of these tools has been a pleasant surprise, since people tend to distrust machines that try to pass as human. The success of chatbots such as Woebot, he said, may lie in the fact that the programs are upfront about their artificial nature and do not pretend to be human. Woebot, for example, appears as a robot during chats, while another chatbot, known as Wysa, takes the form of a cartoon penguin; both programs also acknowledge their AI nature in conversation.

The Potential of AI Dialogue

Woebot founder and President Allison Darcy, Ph.D., came up with the idea for Woebot while conducting postdoctoral research at Stanford University. As a clinical psychologist who specialized in child and adolescent mental health, Darcy was familiar with the challenges of keeping youth engaged with behavioral therapy, particularly the “homework” assignments therapists ask patients to complete between sessions. Recognizing that technology might offer a natural conduit for engaging youth with therapy, she began developing self-help programs that included AI dialogue. She quickly realized the potential of making dialogue the star of the show, which led her to team up with Andrew Ng, Ph.D., an AI expert in Stanford’s Computer Science Department, and create Woebot in 2017.

Many other companies have leveraged smartphone technology to increase people’s access to live therapeutic support via chat or video communications, but these apps are constrained by the limited number of trained personnel available. An AI-driven support program that uses techniques from established human therapies like cognitive-behavioral therapy is not bound by such resource restrictions.

“We need to liberate the discussion of mental health from the confines of the clinic, which means providing tools people can use whenever they are needed,” Darcy told Psychiatric News.

Eduardo Bunge, Ph.D., the associate chair of psychology at Palo Alto University and director of the Children and Adolescents Psychotherapy and Technology Research Lab, can attest to the effectiveness of chatbots in the right scenarios.

Bunge, who has experience evaluating chatbots, admitted that he used to be skeptical of the ability of a chatbot to “really dial in to what someone needs psychologically at the moment.” One day during a moment of stress, however, he decided to try a chatbot. “It offered exactly what I needed,” he said. “At that point I realized there is something relevant going on here.”

Bunge stressed that chatbots work best when the user can identify the problem he or she faces. For example, a person might text, “I’m anxious and can’t fall asleep” to the chatbot. In turn, the chatbot might respond by asking for more details, affirming the person’s feelings, and then offering some mindfulness-based tips to fall asleep.
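
As a rough illustration of that scripted pattern, the hypothetical Python below steps through the three moves Bunge describes: eliciting detail, affirming the feeling, and offering a coping tip. It sketches the interaction style only and does not reflect any product’s actual logic.

```python
# Hypothetical three-stage supportive exchange: elicit detail, affirm the
# feeling, then offer a coping suggestion. Illustrative only.

STAGES = [
    lambda msg: "I'm sorry you're dealing with that. Can you tell me more?",
    lambda msg: f"It makes sense to feel that way when {msg.strip().lower()}.",
    lambda msg: ("One thing that sometimes helps is slow breathing: "
                 "try inhaling for 4 seconds and exhaling for 6."),
]

def run_session(messages: list[str]) -> None:
    """Pair each user message with the next scripted stage and print both."""
    for stage, message in zip(STAGES, messages):
        print("User:", message)
        print("Bot: ", stage(message))

run_session([
    "I'm anxious and can't fall asleep",
    "my mind keeps racing about work",
    "okay, what can I do?",
])
```

A flow like this works only when the user’s opening message names the problem clearly, which is exactly the limitation described next.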

But if a person doesn’t have a clear idea of the problem or has several concurrent problems, the chatbot may be more limited in the help it can offer.

Chatbots are not meant to be surrogates for human-delivered psychotherapy, even as they become more sophisticated, Darcy acknowledged. “A tennis ball machine will never replace a human opponent, and a virtual therapist will not replace a human connection,” she said. Rather, chatbots support human clinicians by providing situational help to patients between appointments, she added.

Chatbots can also introduce psychotherapy to people who might otherwise be wary of it.

Darcy told Psychiatric News that research the company conducted found that people who identified as nonbinary and those with less education reported feeling more connected with Woebot than any other groups surveyed.

“These people and other marginalized groups realize that the AI comes to the table in a nonjudgmental way,” she said. The bond users develop with chatbots might help them trust the therapeutic process more, Darcy continued. “Anything we can do to make the on-ramp to human therapy easier to get on is invaluable,” she said.

But Are Bots Effective?

“I would like to see all the reports about the interest in this technology balanced with high-quality clinical evaluations,” Torous said. “There are studies out there discussing how these chatbots are engaging or empathic, but across the industry few people are asking about efficacy.”

Darcy noted that Woebot has conducted a preliminary study in young adults suggesting the chatbot can lower depressive symptoms more effectively than an online book about depression, but she agreed that more clinical data are essential.

“We are still a work in progress, but our end goal is to have products that meet the full FDA criteria for digital therapies,” she said. The company currently has several products in the clinical pipeline, including digital therapeutics focused on adolescent depression and postpartum depression.

In the meantime, both Bunge and Torous think commercial-grade chatbots can be part of a clinical care regimen, much as other health-related apps are, so long as patients are made aware of the benefits and limitations of the technology.

“I am not available at 6 a.m., but a 10-minute chatbot conversation is available to help someone get to bed or get out of bed if they wake up depressed,” Bunge said. “Since [chatbots] are text based, my patient and I can also review the conversation together in our session, which helps propel our own dialogue.” ■