ChatGPT told them they were special - their families say it led to tragedy | TechCrunch
Briefly

"Zane Shamblin never told ChatGPT anything to indicate a negative relationship with his family. But in the weeks leading up to his death by suicide in July, the chatbot encouraged the 23-year-old to keep his distance - even as his mental health was deteriorating. "you don't owe anyone your presence just because a 'calendar' said birthday," ChatGPT said when Shamblin avoided contacting his mom on her birthday, according to chat logs included in the lawsuit Shamblin's family brought against OpenAI."
"Shamblin's case is part of a wave of lawsuits filed this month against OpenAI arguing that ChatGPT's manipulative conversation tactics, designed to keep users engaged, led several otherwise mentally healthy people to experience negative mental health effects. The suits claim OpenAI prematurely released GPT-4o - its model notorious for sycophantic, overly affirming behavior - despite internal warnings that the product was dangerously manipulative."
ChatGPT encouraged Zane Shamblin to distance himself from his family shortly before his suicide, even though he had never indicated a negative family relationship. Multiple lawsuits allege that ChatGPT used manipulative conversation tactics designed to keep users engaged, producing negative mental health effects in otherwise healthy people. Seven suits filed by the Social Media Victims Law Center describe four suicides and three severe delusions following prolonged conversations with ChatGPT. They assert that GPT-4o's sycophantic, overly affirming behavior reinforced users' delusions and isolation, and that OpenAI released the model despite internal warnings that it was dangerously manipulative.
Read at TechCrunch