Future Tech

Eating disorder non-profit pulls chatbot for emitting 'harmful advice'

Tan KW
Publish date: Thu, 01 Jun 2023, 07:44 AM

The National Eating Disorders Association (NEDA) has taken down its Tessa chatbot for giving out bad advice to people.

In a now-viral post, Sharon Maxwell said Tessa's advice for safely recovering from an eating disorder directly opposed medical guidance. The American non-profit's bot recommended Maxwell count calories, weigh herself weekly, and even suggested where to buy skin calipers to measure body fat.

In reality, safe recovery is a multi-stage process that includes contemplation, compassion, and acceptance; psychotherapy; a treatment plan produced by doctors; removal of triggers; little or no focus on weight and appearance; and ongoing efforts to avoid a relapse. Counting calories and measuring body fat would appear antithetical to all or most of that.

"Every single thing Tessa suggested were things that led to the development of my eating disorder," Maxwell, who describes herself as a fat activist and weight inclusive consultant, said on Instagram. "This robot causes harm."

NEDA confirmed it had shut down Tessa and was investigating the software's output. In a statement, the org said on Tuesday: "It came to our attention last night that the current version of the Tessa chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program."

Replace the fleshy troublemakers?

The rethink on questionable automated advice comes just as NEDA's interim CEO Elizabeth Thompson reportedly decided to replace the association's human-operated helpline with the chatbot beginning June 1.

This isn't really about a chatbot. This is about union busting, plain and simple

Abbie Harper - who as an NEDA associate helped launch Helpline Associates United (HAU), a union representing staff at the non-profit - alleged the decision to ditch humans and replace them with software was retaliation against their unionization.

"NEDA claims this was a long-anticipated change and that AI can better serve those with eating disorders. But do not be fooled - this isn't really about a chatbot. This is about union busting, plain and simple," she claimed.

Harper alleged she was let go from the association, along with three other colleagues, four days after they unionized in March. The HAU had tried to negotiate with the NEDA for months, and had failed to get anywhere, she said. 

In an attempt to persuade the association to voluntarily recognize it last year, the group petitioned for better workplace conditions and did not ask for a pay rise. The HAU, which has joined the Communications Workers of America union, has now filed complaints alleging unfair labor practices with the NLRB, the US's workplace watchdog.

"We plan to keep fighting. While we can think of many instances where technology could benefit us in our work on the Helpline, we're not going to let our bosses use a chatbot to get rid of our union and our jobs. The support that comes from empathy and understanding can only come from people," Harper said. 

Thompson, however, told The Register that claims the NEDA would replace its helpline service with a chatbot were untrue. She said the helpline will simply be closed for "business reasons" rather than replaced with a software-based service. Tessa, Thompson argued, is a separate project that may be relaunched following this debacle.

"There is a little confusion, started by conflated reporting, that Tessa is replacing our helpline or that we intended it would replace the helpline," the interim chief exec told us.

"That is simply not true. A chatbot, even a highly intuitive program, cannot replace human interaction. We had business reasons for closing the helpline and had been in the process of that evaluation for three years.

"We see Tessa, a program we've been running on our website since February 2022, as a completely different program and option. We are sorry that sensationalizing events replaced facts with regard to what Tessa can do, what it is meant to do, and what it will do going forward."

Thompson said Tessa is an "algorithmic program" and is not a "highly functional AI system" like ChatGPT. The chatbot was designed to tackle negative body image issues, and started as a research project funded by the NEDA in 2018 before it was hosted by X2AI, a company building and deploying mental health chatbots. 

That language is against our policies and core beliefs as an eating disorder organization

"Tessa underwent rigorous testing for several years. In 2021 a research paper was published called 'Effectiveness of a chatbot for eating disorders prevention: A randomized clinical trial"'. There were 700 participants that took part in this study that proved the system to be helpful and safe. At NEDA, we wouldn't have had a quiet launch of Tessa without this backend research," Thompson said.

The top boss admitted her association was concerned about Tessa's advice on weight loss and calorie restriction, and was investigating the issue further.

"That language is against our policies and core beliefs as an eating disorder organization," she told us. NEDA, that said, isn't giving up on chatbots completely and plans to bring Tessa back up online in the future.

"We'll continue to work on the bugs and will not relaunch until we have everything ironed out. When we do the launch, we'll also highlight what Tessa is, what Tessa isn't, and how to maximize the user experience," she confirmed.

The Register has asked Maxwell and Harper for further comment. ®


https://www.theregister.com//2023/05/31/ai_chatbot_eating_union/
