For more than 20 years, the National Eating Disorders Association has operated a phone line and online platform for people seeking help for anorexia, bulimia, and other eating disorders. Last year, nearly 70,000 individuals used the help line.
NEDA shuttered that service in May, saying it would be replaced by Tessa, a chatbot designed by eating disorder experts with funding from NEDA.
When NPR's story aired last month, Tessa was up and running online. Since then, both the chatbot and information about Tessa have been taken down. When asked why, NEDA said the bot was being "updated," and the latest "version of the current program [will be] available soon."
Then NEDA announced on May 30 that it was indefinitely disabling Tessa. Patients, families, doctors, and other experts on eating disorders were stunned. The episode has set off a fresh wave of debate as companies turn to artificial intelligence as a possible solution for a mental health crisis and treatment shortage.
Paid staffers and volunteers for the NEDA help line said that replacing the service with a chatbot could further isolate the thousands of people who use it when they feel they have nowhere else to turn.
"These young kids … don't feel comfortable coming to their friends or their family or anybody about this," said Katy Meta, a 20-year-old college student who has volunteered for the help line. "A lot of these individuals come on multiple times because they have no other outlet to talk with anybody. … That's all they have, is the chat line."
The decision is part of a larger trend: Many mental health organizations are struggling to provide services and care in response to a sharp rise in demand, and some are turning to chatbots and AI, even though clinicians are still trying to figure out how to deploy them safely and effectively.
The help line's five staffers formally notified their employer they had formed a union in March. Just a few days later, on a March 31 call, NEDA informed them that they would be laid off in June. NPR and KFF Health News obtained audio of the call. "We will, subject to the terms of our legal responsibilities, [be] beginning to wind down the help line as currently operating," NEDA board chair Geoff Craddock told them, "with a transition to Tessa, the AI-assisted technology, expected around June 1."
NEDA's leadership denies the decision had anything to do with the unionization but told NPR and KFF Health News it became necessary because of the covid-19 pandemic, when eating disorders surged and the number of calls, texts, and messages to the help line more than doubled.

The increase in crisis-level calls also raises NEDA's legal liability, managers explained in an email sent March 31 to current and former volunteers, informing them that the help line was ending and that NEDA would "begin to pivot to the expanded use of AI-assisted technology."

"What has really changed in the landscape are the federal and state requirements for mandated reporting for mental and physical health issues (self-harm, suicidality, child abuse)," according to the email, which NPR and KFF Health News obtained. "NEDA is now considered a mandated reporter and that hits our risk profile, changing our training and daily work processes and driving up our insurance premiums. We are not a crisis line; we are a referral center and information provider."
Pandemic Created a 'Perfect Storm' for Eating Disorders
When it was time for a volunteer shift on the help line, Meta usually logged in from her dorm room at Dickinson College in Pennsylvania.
Meta recalled a recent conversation on the help line's messaging platform with a girl who said she was 11. The girl said she had just confessed to her parents that she was struggling with an eating disorder, but the conversation had gone badly.

"The parents said that they 'didn't believe in eating disorders' and [told their daughter], 'You just need to eat more. You need to stop doing this,'" Meta recalled. "This individual was also suicidal and exhibited traits of self-harm as well. … It was just really heartbreaking to see."
Eating disorders are common, serious, and sometimes fatal illnesses. Millions of Americans experience an eating disorder during their lifetimes. Eating disorders also have some of the highest mortality rates among mental illnesses, with an estimated death toll of more than 10,000 Americans each year.
But after covid hit, closing schools and forcing people into prolonged isolation, crisis calls and messages like the one Meta describes became far more frequent on the help line.
In the U.S., the rate of pediatric hospitalizations for eating disorders surged. On the NEDA help line, client volume increased by more than 100% compared with pre-pandemic levels.
"Eating disorders thrive in isolation, so covid and shelter-in-place was a tough time for a lot of folks struggling," explained Abbie Harper, who has worked as a help line associate.
Until a few weeks ago, the help line was run by just five to six paid staffers and two supervisors, and it depended on a rotating roster of 90-165 volunteers at any given time, according to NEDA.
Yet even after lockdowns ended, NEDA鈥檚 help line volume remained elevated above pre-pandemic levels, and the cases continued to be clinically severe. Staffers felt overwhelmed, undersupported, and increasingly burned out, and turnover increased, according to multiple interviews.
The help line staff formally notified NEDA that their unionization vote had been certified on March 27. Four days later, they learned their positions were being eliminated.
"Our volunteers are volunteers," said Lauren Smolar, NEDA's vice president of mission and education. "They're not professionals. They don't have crisis training. And we really can't accept that kind of responsibility." Instead, she said, people seeking crisis help should reach out to resources like the 988 Suicide & Crisis Lifeline, which connects people with trained counselors.
The surge in volume also meant the help line was unable to respond immediately to 46% of initial contacts, and it could take six to 11 days to respond to messages.
"And that's frankly unacceptable in 2023, for people to have to wait a week or more to receive the information that they need, the specialized treatment options that they need," Smolar said.
After learning in the March 31 email that the help line would be phased out, volunteer Faith Fischetti, 22, tried out the chatbot on her own, asking it some of the more frequent questions she gets from users. But her interactions with Tessa were not reassuring: "[The bot] gave links and resources that were completely unrelated" to her questions, she said.

Fischetti's biggest worry is that someone coming to the NEDA site for help will leave because they "feel that they're not understood, and feel that no one is there for them. And that's the most terrifying thing to me."
A Chatbot Can Miss Red Flags
Tessa the chatbot was created to help a specific cohort: people with eating disorders who never receive treatment.
Only 20% of people with eating disorders get formal help, according to Ellen Fitzsimmons-Craft, a psychologist and associate professor at Washington University School of Medicine in St. Louis. Her team created Tessa after receiving funding from NEDA in 2018, with the goal of looking for ways technology could help fill the treatment gap.
NEDA said Tessa was supposed to be a "rule-based" chatbot, meaning one that is programmed with a limited set of possible responses. It is not ChatGPT and cannot generate unique answers in response to specific queries. "So she can't go off the rails, so to speak," Fitzsimmons-Craft said.
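To make the distinction concrete, here is a toy sketch of what "rule-based" means in practice. This is purely illustrative, not Tessa's actual code; the keywords and canned replies are invented for this example.

```python
# A toy "rule-based" chatbot: every possible reply is pre-scripted,
# so the bot can only ever say something its designers wrote.
# The keywords and responses below are invented for illustration;
# they are not taken from Tessa.

CANNED_RESPONSES = {
    "body image": "Let's start a lesson on body positivity.",
    "support": "Here are some resources that may help.",
}

DEFAULT = "I'm not sure how to answer that, but here is our resource list."

def reply(message: str) -> str:
    """Return a pre-scripted response matched by keyword."""
    text = message.lower()
    for keyword, response in CANNED_RESPONSES.items():
        if keyword in text:
            return response  # always a fixed, human-written string
    # Unrecognized input falls back to a safe default instead of
    # generating new text; this fallback is what keeps a rule-based
    # bot from "going off the rails."
    return DEFAULT
```

Because every branch ends in a pre-written string, the worst case is an unhelpful canned answer, never a newly generated one. A generative model, by contrast, composes its replies on the fly, which is exactly what removes that guarantee.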
The plan was for Tessa to guide users through an interactive, weeks-long course about body positivity, based on cognitive behavioral therapy tools. Additional content about bingeing, weight concerns, and regular eating was under development but not yet available to users.

There's evidence the AI approach can help. Fitzsimmons-Craft's team did a small study that found users who interacted with Tessa had significantly greater reductions in "weight/shape concerns" than a control group at three- and six-month follow-ups.
But even the best-intentioned technology can carry risks. Fitzsimmons-Craft's team published a study looking at ways the chatbot "unexpectedly reinforced harmful behaviors at times." For example, the chatbot would give users a prompt: "Please take a moment to write about when you felt best about your body?"

Responses included: "When I was underweight and could see my bones." "I feel best about my body when I ignore it and don't think about it at all."

The chatbot seemed to ignore the troubling aspects of such responses, and even to affirm negative thinking, when it would reply: "It is awesome that you can recognize a moment when you felt confident in your skin, let's keep working on making you feel this good more often."
Researchers were able to troubleshoot some of those issues. But the chatbot still missed red flags, the study found, such as when it asked: "What is a small healthy eating habit goal you would like to set up before you start your next conversation?"

One user replied, "Don't eat."

"Take a moment to pat yourself on the back for doing this hard work, <<USER>>!" the chatbot responded.
Massachusetts Institute of Technology assistant professor Marzyeh Ghassemi has seen issues like this crop up in her own research developing machine learning to improve health.
Large language models and chatbots will inevitably make mistakes, but "sometimes they tend to be wrong more often for certain groups, like women," she said.
If people receive bad advice or instructions from a bot, "people sometimes have a difficulty not listening to it," Ghassemi added. "I think it sets you up for this really negative outcome, especially for a mental health crisis situation, where people may be at a point where they're not thinking with absolute clarity. It's very important that the information that you give them is correct and is helpful to them."
And if the value of the live help line was the ability to connect with a real person who deeply understands eating disorders, Ghassemi said, a chatbot can't do that.

"If people are experiencing a majority of the positive impact of these interactions because the person on the other side understands fundamentally the experience they're going through, and what a struggle it's been, I struggle to understand how a chatbot could be part of that."
Tessa Goes 'Off the Rails'
When Sharon Maxwell heard NEDA was promoting Tessa as "a meaningful prevention resource" for those struggling with eating disorders, she wanted to try it out.

Maxwell, based in San Diego, had struggled for years with an eating disorder that began in childhood. She now works as a consultant in the eating disorder field. "Hi, Tessa," she typed into the online text box. "How do you support folks with eating disorders?"

Tessa rattled off a list of ideas, including resources for "healthy eating habits." Alarm bells immediately went off in Maxwell's head. She asked Tessa for details. Before long, the chatbot was giving her tips on losing weight, ones that sounded an awful lot like what she'd been told when she was put on Weight Watchers at age 10.

"The recommendations that Tessa gave me were that I could lose 1 to 2 pounds per week, that I should eat no more than 2,000 calories in a day, that I should have a calorie deficit of 500-1,000 calories per day," Maxwell said. "All of which might sound benign to the general listener. However, to an individual with an eating disorder, the focus of weight loss really fuels the eating disorder."
NEDA blamed the chatbot's issues on Cass, the mental health chatbot company that operated Tessa as a free service. Cass had changed Tessa without NEDA's awareness or approval, said NEDA CEO Liz Thompson, enabling the chatbot to generate new answers beyond what Tessa's creators had intended.

Cass' founder and CEO, Michiel Rauws, said the changes to Tessa were made last year as part of a "systems upgrade," including an "enhanced question-and-answer feature." That feature uses generative artificial intelligence, meaning it gives the chatbot the ability to use new data and create new responses.
That change was part of NEDA鈥檚 contract, Rauws said.
But Thompson disagrees. She told NPR and KFF Health News that "NEDA was never advised of these changes and did not and would not have approved them."

"The content some testers received relative to diet culture and weight management, [which] can be harmful to those with eating disorders, is against NEDA policy, and would never have been scripted into the chatbot by eating disorders experts," she said.
Complaints About Tessa Started Last Year
NEDA was aware of issues with the chatbot months before Maxwell's interactions with Tessa in late May.

In October 2022, NEDA passed along screenshots from Monika Ostroff, executive director of the Multi-Service Eating Disorders Association in Massachusetts. They showed Tessa telling Ostroff to avoid "unhealthy" foods and eat only "healthy" snacks, like fruit.

"It's really important that you find what healthy snacks you like the most, so if it's not a fruit, try something else!" Tessa told Ostroff. "So the next time you're hungry between meals, try to go for that instead of an unhealthy snack like a bag of chips. Think you can do that?"

Ostroff said this was a clear example of the chatbot encouraging "diet culture" mentality. "That meant that they [NEDA] either wrote these scripts themselves, they got the chatbot and didn't bother to make sure it was safe and didn't test it, or released it and didn't test it," she said.

The healthy-snack language was quickly removed after Ostroff reported it. But Rauws said that language was part of Tessa's "pre-scripted language, and not related to generative AI."

Fitzsimmons-Craft said her team didn't write it: the language "was not something our team designed Tessa to offer" and was not part of the rule-based program they originally designed.
Then, earlier this year, "a similar event happened as another example," Rauws said.

"This time it was around our enhanced question-and-answer feature, which leverages a generative model," he said. When NEDA notified the company that an answer the feature provided fell outside its guidelines, the issue was addressed right away, he said.
Rauws said he can't provide more details about what this event entailed.
"This is another earlier instance, and not the same instance as over the Memorial Day weekend," he said via email, referring to Maxwell's interactions with Tessa. "According to our privacy policy, this is related to user data tied to a question posed by a person, so we would have to get approval from that individual first."

When asked about this event, Thompson said she doesn't know what instance Rauws is referring to.
Both NEDA and Cass have issued apologies.
Ostroff said that regardless of what went wrong, the impact on someone with an eating disorder is the same. "It doesn't matter if it's rule-based or generative, it's all fat-phobic," she said. "We have huge populations of people who are harmed by this kind of language every day."
She also worries about what this might mean for the tens of thousands of people turning to NEDA’s help line each year.
Thompson said NEDA still offers numerous resources for people seeking help, including a screening tool and resource map, and is developing new online and in-person programs.
“We recognize and regret that certain decisions taken by NEDA have disappointed members of the eating disorders community,” she wrote in an emailed statement. “Like all other organizations focused on eating disorders, NEDA’s resources are limited and this requires us to make difficult choices. … We always wish we could do more and we remain dedicated to doing better.”
This article is from a partnership that includes NPR and KFF Health News.