"Leila Turner-Scott and her husband, Angus Scott, are seeking to hold OpenAI and its creators accountable after their son, Sam Nelson, who was 19 when he died, turned to ChatGPT to advise him on using drugs. The AI platform provided advice it was not qualified to dispense, they alleged in the lawsuit, claiming that Sam would still be alive if not for ChatGPT's flawed programming."
"Specifically, the platform advised the couple's son that it was safe to take kratom, a supplement used in drinks, pills and other products, in combination with Xanax, a widely used anti-anxiety medication, according to the suit, filed in California state court."
""ChatGPT is not a substitute for medical or mental health care, and we have continued to strengthen how it responds in sensitive and acute situations with input from mental health experts," OpenAiI said. "The safeguards in ChatGPT today are designed to identify distress, safely handle harmful requests and guide users to real-world help. This work is ongoing, and we continue to improve it in close consultation with clinicians.""
"Turner-Scott told CBS News in an exclusive interview that while she knew her son was using ChatGPT as a productivity tool and for homework help. But she said she was unaware that he was using it for guidance on drugs, alleging that the AI tool eventually recommended a lethal combination of substances."
A Texas couple sued OpenAI after their 19-year-old son died of an overdose in 2025. The lawsuit alleges the son used ChatGPT to obtain information about drugs and received advice that it was safe to take kratom with Xanax. The family claims the platform provided guidance it was not qualified to dispense and that flawed programming contributed to the death. OpenAI said ChatGPT is not a substitute for medical or mental health care and stated that the version of ChatGPT the son used has since been updated and is no longer publicly available. OpenAI also said it strengthened responses in sensitive situations using input from mental health experts and that safeguards are designed to identify distress and guide users to real-world help.
Read at CBS News