NEDA Takes Down AI Chatbot for Eating Disorder Helpline
The National Eating Disorder Helpline, previously run by paid staff, volunteers, and supervisors, planned to use an AI chatbot starting June 1, 2023, but harmful recommendations have led NEDA to take it down.
In an abrupt turn of events, the National Eating Disorder Association (NEDA) planned to begin using an AI chatbot for its National Eating Disorder Helpline on June 1, 2023. The decision shocked staff members, who were let go in a March 2023 staff meeting. However, recent reports of harmful responses have led NEDA to take the chatbot down.
According to an NPR interview with three NEDA volunteers, the NEDA helpline was formerly run by six staffers, a few supervisors, and nearly 200 volunteers. With an average of 70,000 calls per year, helpline workers struggled to manage the volume, and staff turnover increased dramatically as workers became overwhelmed and burnt out.
Unable to manage the growing number of crisis calls exacerbated by the COVID-19 pandemic, the staff at NEDA opted to unionize, advocating for improved working conditions and resources.
While it is unclear whether this was a direct response to unionization or the natural progression of the helpline, staff were notified that helpline volunteers would be replaced with an AI chatbot, Tessa.
“Our volunteers are volunteers. They're not professionals. They don't have crisis training. And we really can't accept that kind of responsibility. We really need them to go to those services who are appropriate,” argued Lauren Smolar, vice president at NEDA, in an interview with NPR.
Although Smolar's concerns are valid, clinicians and mental health professionals worry that AI may not be the solution that NEDA needs.
“If I'm disclosing to you that I have an eating disorder, I'm not sure how I can get through lunch tomorrow; I don't think most of the people who would be disclosing that would want to get a generic link. Click here for tips on how to rethink food,” said Marzyeh Ghassemi, PhD, a machine learning and health researcher at MIT.
More recent reports validate Ghassemi’s concerns about the potentially harmful nature of the new chatbot. According to testimony by Sharon Maxwell on social media, the chatbot recommended weight loss, calorie counting, calorie deficits, regular weigh-ins, and restrictive dieting — strategies known to lead to disordered eating patterns.
In an Instagram post, NEDA writes, “It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program. We are investigating this immediately and have taken down that program until further notice for a complete investigation.”
It remains unclear whether NEDA plans to update the program or reconsider its use of AI altogether.