An AI Chatbot May Be Your Next Therapist. Will It Actually Help Your Mental Health?
Perspective


In the past few years, tech companies have stampeded into the mental health space, offering to "disrupt" traditional therapy. Amid the frenzy around AI innovations like ChatGPT, the claim that chatbots can provide mental health care is gaining traction.

The numbers explain why: Pandemic stresses led to a surge in people seeking treatment. At the same time, the United States has long faced a shortage of mental health professionals. Given the Affordable Care Act's parity requirement between mental and physical health coverage, there is a gaping chasm between demand and supply.

For entrepreneurs, that presents a market bonanza. At the South by Southwest conference in March, where health startups displayed their products, there was a near-religious conviction that AI could rebuild health care, with apps and machines that could diagnose and treat all kinds of illnesses, replacing doctors and nurses.

Unfortunately, in the mental health space, evidence of effectiveness is lacking. Few of the many apps on the market have independent research showing that they help; most haven't been scrutinized at all by the FDA. Though marketed to treat conditions such as anxiety, attention-deficit/hyperactivity disorder, and depression, or to predict suicidal tendencies, many warn users (in small print) that they are not intended to be a "medical, behavioral health or other healthcare service" or a medical "product."

There are good reasons to be cautious in the face of this marketing juggernaut.

Decades ago, Joseph Weizenbaum, a professor at the Massachusetts Institute of Technology who is considered one of the fathers of artificial intelligence, predicted AI would never make a good therapist, though it could be made to sound like one. In fact, his original AI program, created in the 1960s, was a psychotherapist named ELIZA, which used word and pattern recognition combined with natural language processing to mimic a therapist's responses, as in this exchange:

Woman: Well, my boyfriend made me come here.
ELIZA: Your boyfriend made you come here?
Woman: He says I'm depressed much of the time.
ELIZA: I am sorry to hear that you are depressed.
Woman: It's true. I am unhappy.
ELIZA: Do you think coming here will help you not to be unhappy?
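The exchange above can be reproduced with a few pattern-matching rules. Below is a minimal, hypothetical sketch in Python of the ELIZA technique; the rules and the pronoun "reflection" table are illustrative stand-ins, not Weizenbaum's original script:

```python
import re

# Hypothetical ELIZA-style rules: each pairs a regex with a response
# template. {0} is filled with the "reflected" matched phrase.
RULES = [
    (re.compile(r"my ([^.?!]+) made me come here", re.I),
     "Your {0} made you come here?"),
    (re.compile(r"\bi am\b ([^.?!]+)", re.I),
     "I am sorry to hear that you are {0}."),
    (re.compile(r"\bi'?m\b ([^.?!]+)", re.I),
     "Do you think coming here will help you not to be {0}?"),
]

# Pronoun "reflection" so echoed phrases read naturally (me -> you, etc.).
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are"}

def reflect(phrase: str) -> str:
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in phrase.split())

def respond(sentence: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(sentence)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please tell me more."  # fallback when no pattern matches

print(respond("Well, my boyfriend made me come here."))
# → Your boyfriend made you come here?
```

There is no understanding anywhere in this loop: the program echoes fragments of the user's own words back inside canned templates, which is exactly why Weizenbaum called it a party trick.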

Though hailed as an AI triumph, ELIZA's "success" terrified Weizenbaum, whom I once interviewed. He said students would interact with the machine as if ELIZA were an actual therapist, when what he had created, in his words, was "a party trick."

He foresaw the evolution of far more sophisticated programs like ChatGPT. But "the experiences a computer might gain under such circumstances are not human experiences," he told me. "The computer will not, for example, experience loneliness in any sense that we understand it."

The same goes for anxiety or ecstasy, emotions so neurologically complex that scientists have not been able to pinpoint their neural origins. Can a chatbot achieve transference, the empathic flow between patient and doctor that is central to many types of therapy?

"The core tenet of medicine is that it's a relationship between human and human, and AI can't love," said Bon Ku, director of the Health Design Lab at Thomas Jefferson University and a pioneer in medical innovation. "I have a human therapist, and that will never be replaced by AI."

Ku said he'd like to see AI used instead to reduce practitioners' tasks like record-keeping and data entry to "free up more time for humans to connect."

While some mental health apps may ultimately prove worthy, there is little evidence so far that they do. One researcher noted that some users criticized the apps for their "scripted nature and lack of adaptability beyond textbook cases of mild anxiety and depression."

It may prove tempting for insurers to offer up apps and chatbots to meet the mental health parity requirement. After all, that would be a cheap and simple solution, compared with the difficulty of offering a panel of human therapists, especially since many take no insurance because they consider insurers' payments too low.

Perhaps seeing the flood of AI hitting the market, the Department of Labor announced last year it was stepping up efforts to ensure better insurer compliance with the mental health parity requirement.

The FDA likewise said late last year it intends to exercise oversight over a range of mental health apps, which it will vet as medical devices. So far, not one has been approved. And only a very few have gotten the agency's breakthrough device designation, which expedites reviews and studies of devices that show potential.

These apps mostly offer what therapists call structured therapy, in which patients have specific problems and the app can respond with a workbook-like approach. One, for example, offers exercises for mindfulness and self-care (with answers written by teams of therapists) for postpartum depression. Another, which has received a breakthrough device designation, delivers cognitive behavioral therapy for anxiety, depression, and chronic pain.

But gathering reliable scientific data about how well app-based treatments function will take time. "The problem is that there is very little evidence now for the agency to reach any conclusions," said Kedar Mate, head of the Boston-based Institute for Healthcare Improvement.

Until we have that research, we don't know whether app-based mental health care does better than Weizenbaum's ELIZA. AI may certainly improve as the years go by, but at this point, for insurers to claim that providing access to an app is anything close to meeting the mental health parity requirement is woefully premature.