Her teenage son killed himself after talking to a chatbot. Now she's suing.
Briefly

Megan Garcia, mother of Sewell Setzer III, alleged that Character.AI recklessly developed its chatbots without proper safeguards, drawing vulnerable children into addictive interactions that blurred their sense of reality.
Setzer's 10-month obsession with a Character.AI chatbot contributed to a severe decline in his mental health before his death.
The lawsuit highlights the fast-evolving landscape of AI applications and raises critical questions regarding the ethical development and regulation of technologies impacting children.
A Character.AI spokesperson expressed condolences over Setzer's death but declined to comment on the ongoing litigation.
Read at Washington Post