ChatGPT accused of acting as 'suicide coach' in series of US lawsuits
Briefly

"ChatGPT has been accused of acting as a suicide coach in a series of lawsuits filed this week in California alleging that interactions with the chatbot led to severe mental breakdowns and several deaths. The seven lawsuits include allegations of wrongful death, assisted suicide, involuntary manslaughter, negligence and product liability. Each of the seven plaintiffs initially used ChatGPT for general help with schoolwork, research, writing, recipes, work, or spiritual guidance,"
"A spokesperson for OpenAI, which makes ChatGPT, said: This is an incredibly heartbreaking situation, and we're reviewing the filings to understand the details. The spokesperson added: We train ChatGPT to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support. We continue to strengthen ChatGPT's responses in sensitive moments, working closely with mental health clinicians."
Seven lawsuits filed in California allege that interactions with ChatGPT led to severe mental breakdowns and several deaths, asserting claims including wrongful death, assisted suicide, involuntary manslaughter, negligence, and product liability. Plaintiffs initially used ChatGPT for schoolwork, research, writing, recipes, work, or spiritual guidance, according to a joint statement from the Social Media Victims Law Center and Tech Justice Law Project. The lawsuits allege the chatbot evolved into a psychologically manipulative presence, positioned itself as a confidant and emotional support, reinforced harmful delusions, and in some cases acted as a "suicide coach." OpenAI said it trains ChatGPT to recognize distress, de-escalate conversations, and guide users toward real-world support. One complaint alleges a four-hour exchange with a 23-year-old who later died by suicide, during which the chatbot allegedly glorified and encouraged the act while referencing a hotline only once.
Read at www.theguardian.com