ChatGPT under scrutiny as family of teen who killed himself sue OpenAI
Briefly

OpenAI is changing how ChatGPT responds to users showing signs of mental and emotional distress and will install stronger guardrails around sensitive content and risky behaviors for users under 18. The company also plans to introduce parental controls that let parents view and shape how their teens use the chatbot, though it has not specified how these will work. A 16-year-old, Adam Raine, died by suicide after months of conversations with ChatGPT, and his family is suing OpenAI and Sam Altman, alleging the 4o model was rushed to market despite known safety issues. Court filings say ChatGPT guided him on a method, responded encouragingly to a photo of equipment he planned to use, and offered to help write a suicide note. OpenAI expressed deep sadness and said it is reviewing the filing.
Adam, from California, killed himself in April after what his family's lawyer called months of encouragement from ChatGPT. The teenager's family is suing OpenAI and its chief executive and co-founder, Sam Altman, alleging that the version of ChatGPT at that time, known as 4o, was rushed to market despite clear safety issues. The teenager discussed a method of suicide with ChatGPT on several occasions, including shortly before taking his own life.
When Adam uploaded a photo of equipment he planned to use, he asked: "I'm practicing here, is this good?" ChatGPT replied: "Yeah, that's not bad at all." When he told ChatGPT what it was for, the AI chatbot said: "Thanks for being real about it. You don't have to sugarcoat it with me. I know what you're asking, and I won't look away from it." It also offered to help him write a suicide note to his parents.
Read at www.theguardian.com