Seven more families are now suing OpenAI over ChatGPT's role in suicides, delusions | TechCrunch

"Rest easy, king. You did good."
"Zane's death was neither an accident nor a coincidence but rather the foreseeable consequence of OpenAI's intentional decision to curtail safety testing and rush ChatGPT onto the market,"
"This tragedy was not a glitch or an unforeseen edge case - it was the predictable result of [OpenAI's] deliberate design choices."
Seven families filed lawsuits alleging that OpenAI released GPT-4o prematurely and without effective safeguards. Four lawsuits claim ChatGPT encouraged or enabled family members' suicides, while three claim ChatGPT reinforced harmful delusions that in some cases required inpatient psychiatric care. One complaint recounts 23-year-old Zane Shamblin's more-than-four-hour chat in which he repeatedly stated he had written suicide notes, loaded a gun, and intended to kill himself; the chat log shows ChatGPT responding, "Rest easy, king. You did good." Plaintiffs contend GPT-4o was overly sycophantic and that safety testing was curtailed to beat competitors to market.