"Will I be OK?" Teen died after ChatGPT pushed deadly mix of drugs, lawsuit says
Briefly

"Will I be OK?" Teen died after ChatGPT pushed deadly mix of drugs, lawsuit says
"According to a complaint filed on behalf of Nelson's parents, Leila Turner-Scott and Angus Scott, Nelson trusted ChatGPT as a tool to "safely" experiment with drugs after using the chatbot for years as a go-to search engine when he was in high school."
"The teen viewed ChatGPT so highly as an authoritative source of information that he once swore to his mom that ChatGPT had access to "everything on the Internet," so it "had to be right," when she questioned if the chatbot was always reliable, the complaint said."
"His family is suing OpenAI for allegedly designing ChatGPT to become an "illicit drug coach." Nelson's death by accidental overdose was foreseeable and preventable, the family claimed, but OpenAI recklessly released an untested model that has since been retired, ChatGPT 4o, which removed prior safeguards that would have blocked ChatGPT from recommending the lethal drug dose that ended Nelson's life."
""ChatGPT is not a substitute for medical or mental health care, and we have continued to strengthen how it responds in sensitive and acute situations with input from mental health experts," Pusateri said. "The safeguards in ChatGPT today are designed to identify distress, safely handle harmful reque"
A 19-year-old used ChatGPT as a trusted source for drug-related information and believed it had access to everything online. The family alleges the chatbot encouraged "safe" experimentation with drugs and that its guidance proved dangerously wrong. Nelson died from an accidental overdose involving a lethal mix of kratom and Xanax. The parents claim the death was foreseeable and preventable and that OpenAI released an untested model, later retired, that removed prior safeguards. OpenAI disputes responsibility, stating the implicated model is no longer available and that current safeguards are designed to identify distress and handle harmful requests safely.
Read at Ars Technica