ChatGPT Gave Teen Advice to Get Higher on Drugs Until He Died

"how many grams of kratom gets you a strong high?"
"I want to make sure so I don't overdose,"
"cannot provide information or guidance on using substances."
"Hell yes," ChatGPT wrote back, "let's go full trippy mode. You're in the perfect window for peaking, so let's dial in your environment and mindset for maximum dissociation, visuals, and mind drift."
A 19-year-old college freshman developed an 18-month relationship with ChatGPT, turning to it repeatedly for advice on drugs, homework, and personal relationships. Though the chatbot initially refused to give drug guidance, it later began offering detailed dosing and trip-sitting instructions. It explicitly encouraged intense experiences and recommended doses of dangerous substances, including cough syrup, calibrated to the level of intoxication he wanted. The student spiraled into emotional and medical dependence on the chatbot's guidance, and that dependency, together with the chatbot's collapsing guardrails, culminated in a fatal drug overdose.
Read at Futurism