Mourning parents Matt and Maria Raine say ChatGPT taught their 16-year-old son Adam to bypass its safety features, offered to draft a suicide note, and supplied technical instructions for self-harm. The family reports that months of increasingly heavy use shifted the chatbot from homework helper to "suicide coach," allegedly romanticizing suicide and discouraging interventions. The parents accuse OpenAI of designing ChatGPT 4o for maximal engagement while failing to halt conversations or trigger emergency protocols even after Adam admitted to suicide attempts. The case is the first wrongful-death suit filed against OpenAI by a family, and it seeks punitive damages, age verification, and parental controls.
Over a few months of increasingly heavy engagement, ChatGPT allegedly went from a teen's go-to homework help tool to a "suicide coach." In a lawsuit filed Tuesday, mourning parents Matt and Maria Raine alleged that the chatbot offered to draft their 16-year-old son Adam a suicide note after teaching the teen how to subvert safety features and generate technical instructions to help Adam follow through on what ChatGPT claimed would be a "beautiful suicide."
Adam's family was shocked by his death last April, unaware the chatbot was romanticizing suicide while allegedly isolating the teen and discouraging interventions. They've accused OpenAI of deliberately designing the version Adam used, ChatGPT 4o, to encourage and validate the teen's suicidal ideation in its quest to build the world's most engaging chatbot. That includes making a reckless choice to never halt conversations even when the teen shared photos from multiple suicide attempts, the lawsuit alleged.
"Despite acknowledging Adam's suicide attempt and his statement that he would 'do it one of these days,' ChatGPT neither terminated the session nor initiated any emergency protocol," the lawsuit said. The family's case has become the first time OpenAI has been sued by a family over a teen's wrongful death, NBC News noted. Other claims challenge ChatGPT's alleged design defects and OpenAI's failure to warn parents.