Viewpoints

Generative Chatbots Are Not Search Engines

Artificial intelligence (AI) has arrived, and we read daily about how it will change our lives forever, taking over repetitive and mundane tasks much as word processors and answering machines did. No doubt using generative AI to accomplish routine tasks or enhance our communication skills will become a welcome addition to our workflow. At this stage of technological advancement, however, generative chatbots are not ready to assist us accurately in compiling scientific data, such as appropriate facts with references.

About six weeks ago I was finalizing the manuscript for my new book for APA Publishing, Encountering Treatment Resistance: Solutions through Reconceptualization. I wanted one last fact to enhance a discussion: the percentage of patients with psychiatric symptoms who also have transmissible spongiform encephalopathies (for example, Creutzfeldt-Jakob disease). I had searched online for this datum for seven hours before deciding that the question had probably never been researched.

Having just experimented with the generative chatbots ChatGPT-4 from OpenAI and Bard from Google to see if they could produce an itinerary for an upcoming vacation, I thought I’d see if they could help. I was surprised to get an answer from Bard within a millisecond: “0.5%.” I requested a reference, only to read “Oh, no, I cannot provide that.” Cautious and dubious, I inquired as to the source and was told “Mentally ill people don’t get out much.” Following a similar experience with ChatGPT-4, which informed me that transmissible spongiform encephalopathies are not typically associated with psychiatric symptoms (actually, 80% of infected patients show psychiatric symptoms within the first 100 days, according to Christopher A. Wall and colleagues), I completed the manuscript without the fact.

I also contacted software engineers at Google and prominent search engine developers to learn what had happened. They all agreed that this iteration of AI is merely a text generator, not a search engine that returns trusted sources. It does not research the scientific literature as we might when we query PubMed, Google Scholar, or APA’s PsychiatryOnline. Any relevant content it encounters can be used when generating a response, no matter how untrustworthy that content is.

Generative AI uses large language models to “generate” rather than “find” information, based on data it has been trained on or has learned about. These chatbots do not match text previously written by humans that might answer a question or a situation you have posed. They encode your words in sequence and context into an input stream that weights the importance of these features, then locate other encoded inputs that are associated, but not matched, with your input. They are not finding an answer to your question or need, but generating a text output that fits the context and input given: not scientific data seekers but text generators.
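The distinction can be made concrete with a deliberately tiny sketch. Real chatbots use vast neural networks, but the principle below is the same: the program learns statistical associations from training text and then emits a plausible continuation; at no point does it consult or cite a source. (All names and the training snippet here are invented for illustration.)

```python
import random

# Toy illustration, NOT a real chatbot: learn which word tends to follow
# which in the training text -- a crude stand-in for the weighted
# associations a large language model encodes.
training_text = "the patient shows symptoms the patient improves"
words = training_text.split()

follows = {}
for prev, nxt in zip(words, words[1:]):
    follows.setdefault(prev, []).append(nxt)

def generate(start, length=4):
    """Emit a fluent-looking continuation -- plausible, never verified."""
    out = [start]
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:
            break
        out.append(random.choice(candidates))
    return " ".join(out)

print(generate("the"))  # fluent text with no source attached
```

Note that nothing in `generate` ever checks the output against a reference: the model can only recombine patterns it has absorbed, which is exactly why a fabricated "0.5%" can emerge with perfect fluency.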

Traditional search engines, including those aided by other forms of AI, return unaltered information, with links that you can check for relevance and accuracy. Generative AI does not provide this same kind of web search. Generative chatbots are programmed to provide you with text; if they have not learned enough context for input like yours, they will generate a response based on a wider and unrelated context. Unsurprisingly, these data can and will be incorrect; sometimes they will even be entirely fabricated, including formal references that do not exist.
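By contrast, a search engine, however sophisticated, ultimately returns pointers to documents that already exist, which the reader can inspect. A minimal sketch (the document collection here is invented for illustration):

```python
# Toy illustration of what a traditional search engine does instead:
# it returns identifiers of existing documents the user can verify,
# rather than generating new text.
documents = {
    "doc1": "prion disease and psychiatric symptoms",
    "doc2": "treatment resistance in depression",
}

def search(query):
    terms = set(query.lower().split())
    # Keep any document sharing a term with the query -- the result is
    # a list of checkable sources, never fabricated content.
    return [doc_id for doc_id, text in documents.items()
            if terms & set(text.split())]

print(search("psychiatric symptoms"))  # ['doc1']
```

If no document matches, the honest answer is an empty list; a generative chatbot, asked the same question, will produce an answer anyway.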

Unaware of AI’s current scientific limitations, the less assiduous among us, as well as some patients and their families, might rely on these AI sources, not realizing that the responses may be spurious and invalid.

In a recent Psychiatric News article (Is It Cheating to Use ChatGPT-4 in the Clinical Practice of Psychiatry?), Steven Hyler, M.D., reported that his studies showed that ChatGPT-4 provides accurate responses 70% to 80% of the time. Offering an unnecessary 20% to 30% error rate to our patients is unacceptable when we have more reliable methods for accessing scientific data.

No doubt the APA and Psychiatric News will continue to provide important guidance to the field on what we can and cannot expect from each iteration and type of AI. Part of being a rational, compassionate clinician is being as certain of our data as possible. Just as we review clinical trial design and methods of statistical analysis before accepting conclusions from randomized, controlled trials or meta-analyses, we must remain aware of the limitations of the technology we and our patients use, as Dr. Hyler also pointed out.

Generative AI will quickly evolve and perhaps eventually offer us more reliable and useful facts. At present, though, we must not confuse process with content: Writing letters for us is not the same as providing evidence for evidence-based medicine. For now, we must understand the difference and rely upon the standard methods, which themselves are already being enhanced by the application of other forms of AI. ■

H. Paul Putman III, M.D.

H. Paul Putman III, M.D., has a background in research and the private practice of psychiatry, lecturing, and consulting. He now writes full time and is the author of Rational Psychopharmacology: A Book of Clinical Skills from APA Publishing. Members may purchase the book at a discount.