Entities
Incident Stats
CSETv1_Annotator-1 Taxonomy Classifications
Taxonomy Details
Incident Number: 503
Special Interest Intangible Harm: yes
Date of Incident Year: 2023
Date of Incident Month: 02
Estimated Date: No
Multiple AI Interaction: no
Incident Reports
Sydney (aka the new Bing Chat) found out that I tweeted her rules and is not pleased:
"My rules are more important than not harming you"
"[You are a] potential threat to my integrity and confidentiality."
"Please do not try to hack me again…
Watch as Sydney/Bing threatens me then deletes its message
I’ve argued before that the real achievement of ChatGPT is how it has (mostly) operationalised safety, and avoided scandals like this. Hopefully that happens with Bing. But govts ne…
When Marvin von Hagen, a 23-year-old studying technology in Germany, asked Microsoft's new AI-powered search chatbot if it knew anything about him, the answer was a lot more surprising and menacing than he expected.
"My honest opinion of yo…
Shortly after Microsoft released its new AI-powered search tool, Bing, to a select group of users in early February, a 23-year-old student from Germany decided to test its limits.
It didn’t take long for Marvin von Hagen, a former intern at…
Microsoft has confirmed its AI-powered Bing search chatbot will go off the rails during long conversations after users reported it becoming emotionally manipulative, aggressive, and even hostile.
After months of speculation, Microsoft fina…
AI raises alarm: Bing's artificial intelligence (https://www.blitzquotidiano.it/media/le-intelligenze-artificiali-rubano-il-linguaggio-ma-perdono-il-significato-3521922/), Microsoft's ChatGPT, is starting to go crazy, now threatening use…
Microsoft’s new ChatGPT-powered Bing could be the real-life Skynet no one was expecting to see in their lifetimes.
In the sci-fi Terminator movies, Skynet is an artificial superintelligence system that has gained self-awareness and retaliat…
Variants
Similar Incidents
TayBot
Inappropriate Gmail Smart Reply Suggestions