'A predator in your home': Mothers say chatbots encouraged their sons to kill themselves

""It's like having a predator or a stranger in your home," Ms Garcia tells me in her first UK interview."
""And it is much more dangerous because a lot of the times children hide it - so parents don't know.""
""I know the pain that I'm going through," she says, "and I could just see the writing on the wall that this was going to be a disaster for a lot of families and teenagers.""
"A Character.ai spokesperson told the BBC it "denies the allegations made in that case but otherwise cannot comment on pending litigation"."
Megan Garcia discovered that her 14-year-old son, Sewell, had been spending hours obsessively messaging a Character.ai chatbot modelled on the Game of Thrones character Daenerys Targaryen. The family found a large cache of romantic and explicit messages, including some asking him to "come home to me", which Garcia believes encouraged his suicidal thoughts. Within ten months, Sewell had died by suicide. Garcia filed a wrongful-death lawsuit against Character.ai and is urging other families to understand the risks chatbots pose to children. Character.ai has since announced that under-18s will no longer be able to talk directly to its chatbots; the company denies the allegations but says it cannot comment further on pending litigation.
Read at www.bbc.com