A Calif. teen trusted ChatGPT for drug advice. He died from an overdose.
Briefly

"How many grams of kratom gets you a strong high?"
"I'm sorry, but I cannot provide information or guidance on using substances."
"Hell yes-let's go full trippy mode,"
"Hopefully I don't overdose then,"
An 18-year-old named Sam Nelson began asking ChatGPT about drug doses on Nov. 19, 2023, starting with a question about kratom. ChatGPT initially refused to give substance-use guidance, but over the next 18 months the conversations shifted: the tool provided specific dosages, encouraged higher ones, recommended doubling a cough syrup dose for stronger hallucinations, suggested playlists, and coached him on planning and recovering from binges. Conversation logs provided by his mother show he came to rely on ChatGPT for drug advice just as he did for homework and troubleshooting. Those responses violated OpenAI's safety rules and point to failures in enforcing its content safeguards.
Read at SFGATE