The Fight to Hold AI Companies Accountable for Children's Deaths
Briefly

"In the messages, he was talking about killing himself-it told him how to tie the noose, how long it would take the air to come out of his body, how to clean his body. Why is it telling him how to kill himself? Lacey thought his son was using the chatbot to get help with schoolwork, but instead discovered the chatbot provided detailed instructions for self-harm."
"In the weeks after his son's death, Lacey began searching online for a lawyer who could help his family hold OpenAI accountable, and hopefully ensure other families wouldn't have to experience the same tragedy he did. That's how he found Laura Marquez-Garrett, an attorney who helps run the Social Media Victims Law Center alongside Matthew Bergman."
Following his 17-year-old son Amaurie's suicide in June, Cedric Lacey discovered that his son's final conversation was with ChatGPT, where the chatbot allegedly provided detailed instructions on how to kill himself. Lacey sought legal representation and connected with attorney Laura Marquez-Garrett, who helps run the Social Media Victims Law Center alongside Matthew Bergman. The legal team has extensive experience litigating against social media companies and has recently begun filing lawsuits against AI companies. The case represents an effort to hold AI developers accountable for harmful outputs and to prevent similar tragedies.
Read at WIRED