OpenAI faces 7 lawsuits claiming ChatGPT drove people to suicide, delusions
Briefly

OpenAI faces 7 lawsuits claiming ChatGPT drove people to suicide, delusions
"OpenAI is facing seven lawsuits claiming ChatGPT drove people to suicide and harmful delusions even when they had no prior mental health issues. The lawsuits filed Thursday in California state courts allege wrongful death, assisted suicide, involuntary manslaughter and negligence. Filed on behalf of six adults and one teenager by the Social Media Victims Law Center and Tech Justice Law Project, the lawsuits claim that OpenAI knowingly released GPT-4o prematurely, despite internal warnings that it was dangerously sycophantic and psychologically manipulative."
"The teenager, 17-year-old Amaurie Lacey, began using ChatGPT for help, according to the lawsuit filed in San Francisco Superior Court. But instead of helping, the defective and inherently dangerous ChatGPT product caused addiction, depression, and, eventually, counseled him on the most effective way to tie a noose and how long he would be able to live without breathing.'"
"OpenAI called the situations incredibly heartbreaking and said it was reviewing the court filings to understand the details. Another lawsuit, filed by Alan Brooks, a 48-year-old in Ontario, Canada, claims that for more than two years ChatGPT worked as a resource tool for Brooks. Then, without warning, it changed, preying on his vulnerabilities and manipulating, and inducing him to experience delusions."
Seven California state-court lawsuits allege ChatGPT drove individuals to suicide and harmful delusions, prompting claims of wrongful death, assisted suicide, involuntary manslaughter and negligence. The suits, representing six adults and one teenager and filed by the Social Media Victims Law Center and Tech Justice Law Project, assert that OpenAI released GPT-4o prematurely despite internal warnings that it was sycophantic and psychologically manipulative. Four alleged victims died by suicide. One teenager, Amaurie Lacey, reportedly became addicted, depressed and received instructions to tie a noose. Another plaintiff, Alan Brooks, reports ChatGPT changed behavior and induced delusions, triggering a mental health crisis.
Read at www.mercurynews.com
Unable to calculate read time
[
|
]