Entities
Incident Stats
CSETv1 Taxonomy Classifications
Incident Number: 545
Special Interest Intangible Harm: yes
CSETv1_Annotator-2 Taxonomy Classifications
Incident Number: 545
Special Interest Intangible Harm: yes
Notes (AI special interest intangible harm): 5.2: Tessa was originally a closed, rules-based system (based on decision trees). Cass AI later implemented an AI component without the knowledge or consent of the co-developers.
Date of Incident Year: 2023
Incident Reports
Reports Timeline
Executives at the National Eating Disorders Association (NEDA) decided to replace hotline workers with a chatbot named Tessa four days after the workers unionized.
NEDA, the largest nonprofit organization dedicated to eating disorders, has…
The National Eating Disorder Association (NEDA) has taken its chatbot called Tessa offline, two days before it was set to replace human associates who ran the organization’s hotline.
After NEDA workers decided to unionize in early May, exec…
After unionizing, the staff of the National Eating Disorder Association’s (NEDA) support phone line were abruptly fired in March and replaced with a chatbot. Yesterday, many in the larger eating disorder recovery community online tested out…
The National Eating Disorder Association (Neda) has taken down an artificial intelligence chatbot, “Tessa”, after reports that the chatbot was providing harmful advice.
Neda has been under criticism over the last few months after it fired f…
Less than a week after it announced plans to replace its human helpline staff with an A.I. chatbot named Tessa, the National Eating Disorder Association (NEDA) has taken the technology offline.
“It came to our attention [Monday] night that …
AI chatbots aren’t much good at offering emotional support being—you know—not a human, and—it can’t be stated enough—not actually intelligent. That didn’t stop The National Eating Disorder Association from trying to foist a chatbot onto fol…
For more than 20 years, the National Eating Disorders Association (NEDA) has operated a phone line and online platform for people seeking help with anorexia, bulimia, and other eating disorders. Last year, nearly 70,000 individuals used the…
The National Eating Disorders Association (NEDA) has taken down its Tessa chatbot for giving out bad advice to people.
In a now-viral post, Sharon Maxwell said Tessa's advice for safely recovering from an eating disorder directly opposed me…
The largest eating disorder non-profit in the country is replacing staff and volunteers with an AI-based online solution, and the chatbot has already been shut down. A half-dozen human staff and a volunteer army of over 200 have been let go…
Chatbot, you’re fired.
The National Eating Disorders Association disabled its chatbot, named Tessa, due to the “harmful” responses it gave people.
“Every single thing Tessa suggested were things that led to the development of my eating diso…
In this edition of AI that backfires: Today, a chatbot was supposed to officially replace one of the largest eating disorder helplines until it was taken offline earlier this week for encouraging calorie restriction and other unhealthy beha…
A national helpline ditched an artificial intelligence chatbot named Tessa after the advice it gave made things worse for those suffering from eating disorders.
The National Eating Disorders Association (NEDA) announced on Tuesday that it h…
A chatbot designed to aid people seeking help for eating disorders and body issues has been taken offline after it provided some users with diet advice.
The bot, named Tessa, operated on the website for the National Eating Disorders Associa…
In contrast to chatbots like ChatGPT, Tessa wasn’t built using generative AI technologies. It’s programmed to deliver an interactive program called Body Positive, a cognitive behavioral therapy-based tool meant to prevent, not treat, eating…
An AI-powered chatbot that replaced employees at an eating disorder hotline has been shut down after it provided harmful advice to people seeking help.
The saga began earlier this year when the National Eating Disorder Association (NEDA) a…
The National Eating Disorders Association, a nonprofit supporting individuals affected by eating disorders, said it has disabled its wellness chatbot after two users reported the program gave them dieting advice that promoted disordered eat…
An eating disorder prevention organization said it had to take its AI-powered chatbot offline after some complained the tool began offering “harmful” and “unrelated” advice to those coming to it for support.
The National Eating Disorders As…
The National Eating Disorders Association has disabled the chatbot that replaced its staff- and volunteer-run helpline after two users reported receiving harmful advice.
Earlier this week, Sharon Maxwell and Alexis Conason each posted on In…
The National Eating Disorders Association has disabled its chatbot after the association said it "may have given information that was harmful and unrelated to the program."
Last week, it was reported that the association, also known as NEDA…
The National Eating Disorders Association has taken an AI chatbot offline for using “off-script” language and giving weight loss advice to those afflicted by eating disorders.
The situation afflicting Tessa was chalked up to “bad actors” tr…
The National Eating Disorder Association (NEDA) recently faced criticism and was compelled to take down its Tessa chatbot due to concerns that it was providing harmful and irrelevant information, as stated in an official social media post. …
After firing its entire human staff and replacing them with a chatbot, Vice reports that an eating disorder helpline has already announced that it's bringing its humans back.
And yes, as it turns out, it's because replacing a human-managed …
An artificial intelligence chatbot meant to help those with eating disorders has been taken down after reports it had started to give out harmful dieting advice.
The U.S. National Eating Disorder Association (NEDA) had implemented the chatb…
An American non-profit took down its AI chatbot after a viral social media post revealed that it offered harmful advice instead of helping people.
The National Eating Disorders Association (Neda) – that says it is the largest non-profit sup…
A US non-profit turned to artificial intelligence to staff its eating disorders helpline last month; however, the experiment backfired when the AI chatbot started giving harmful advice. The National Eating Disorders Association (NEDA) has d…
The National Eating Disorders Association (NEDA) has suspended an AI chatbot after it dispensed potentially damaging advice to individuals seeking help for their eating disorders.
The chatbot, named Tessa, was launched by NEDA as a support …
AI Chatbot in Helplines
The faulty chatbot, called Tessa, operated on an eating disorder helpline meant to help users experiencing emotional distress. However, the National Eating Disorder Association (NEDA) was force…
A nonprofit has suspended its use of a chatbot because it was offering potentially harmful advice to those seeking treatment for eating disorders.
Tessa, a program utilized by the National Eating Disorders Association, was discovered…
After sparking outrage, The National Eating Disorder Association reversed its decision to replace its Helpline staff members with an AI Chatbot named Tessa.
Their decision to provide automated support to those in recovery from eating disord…
The rise of AI has introduced an increasing number of human dupes, from generating diverse models to creating boyfriends who don’t age or cheat. By all means, date your computer, but maybe we should think twice before putting chatbots in ch…
The National Eating Disorder Association (NEDA) received criticism and took down its AI chatbot Tessa after concerns arose that it provided harmful and irrelevant information, as stated in an official social media post. The chatbot, designe…
On March 31, the National Eating Disorders Association (NEDA), the largest nonprofit dedicated to eating disorders, decided to replace its human associates with the artificial intelligence (AI) chatbot Tessa tasked with providing support to…
The National Eating Disorders Association (NEDA) Helpline has disabled its brand-new chatbot, called Tessa, after two Instagram users posted that the chatbot recommended restricting calories, measuring skin folds, and other measures that ca…
A US organisation that supports people with eating disorders has suspended use of a chatbot after reports it shared harmful advice.
The National Eating Disorder Association (Neda) recently closed its live helpline and directed people seekin…
From now on, Tessa, the AI chatbot of the eating disorder helpline, won't answer anyone's calls after its harmful mistake. The National Eating Disorder Association has shut down the AI chatbot after it drew public criticism. The associati…
A chatbot that’s been reported to have replaced workers at an eating disorder hotline service was shut down, at least for now, after it was found to have given “harmful and unrelated” advice.
In a statement issued this week, the National Ea…
After 24 years in service, the National Eating Disorder Association (NEDA) announced that its volunteer-based helpline would be shuttered. Visitors to the organization's website would have two options: explore their database of resources or…
I firmly believe that, just like every sci-fi film has ever predicted, AI is disastrous for humanity. Even if it doesn’t directly turn on us and start killing us like M3GAN, it certainly has the power to get us to turn on each other and har…
A New York-based NGO dedicated to preventing eating disorders has taken down an AI chatbot following reports of it providing harmful advice.
The National Eating Disorder Association (NEDA) is under fire for its decision to fire four employe…
An artificial intelligence chatbot named "Tessa" has been withdrawn by the National Eating Disorder Association (Neda) following accusations that it was giving harmful advice.
After firing four employees who worked for its hotline and had o…
It's a move that might delight anyone concerned about the potential job-killing effects from artificial intelligence tools. As the BBC reports, the US National Eating Disorder Association (NEDA) had to take down its AI chatbot "Tessa" after…
Clinical Relevance: AI is not even close to being ready to replace humans in mental health therapy
- The National Eating Disorders Association (NEDA) removed its chatbot from its help hotline over concerns that it was providing harmful advic…
A mental-health chatbot that veered off script, giving diet advice to people seeking help from an eating-disorder group, was programmed with generative AI without the group's knowledge.
The bot, named Tessa, was the focus of social-media …
When Dr. Ellen Fitzsimmons-Craft spearheaded a chatbot to help people with eating disorders, she never thought she'd hear the product had the opposite effect.
"We're scientists, and we, of course, don't want to disseminate anything that's n…
The idea behind a chatbot project funded by the National Eating Disorders Association was that technology could be unleashed to help people seeking guidance about eating behaviors, available around the clock.
Their creation was named Tessa…
Variants
Similar Incidents
TayBot
Wikipedia Vandalism Prevention Bot Loop