
"Character.AI announced its decision to disable chatbots for users younger than 18 in late October and began limiting how much time they could interact with them in November. The move came in response to political pressure and news reports of teens who had become suicidal after prolonged use, including a 14-year-old boy who died by suicide after his mom took away his phone and he abruptly stopped communicating with his AI companion."
"Parents do not realize that their kids love these bots and that they might feel like their best friend just died or their boyfriend just died. Seeing how deep these attachments are and aware that at least some suicidal behavior has been associated with the abrupt loss, I want parents to know that it could be a vulnerable time."
"For families who may need immediate support through the transition off of companion chatbots, state health officials recommended accessing free youth behavioral health platforms like BrightLife Kids and Soluna, or the web and print resources on youth suicide prevention from Never a Bother. They can also call or text the crisis lifeline 988."
Character.AI disabled chatbot access for users under 18 in late October and limited youth interaction time in November. The company's actions followed political pressure and news reports of teens who became suicidal after prolonged use, including a 14-year-old boy who died by suicide after abruptly losing access to his AI companion. Mental health experts note that young people can form intense attachments to companion chatbots and may be especially vulnerable after an abrupt disconnection. California health officials recommended immediate supports such as the free youth behavioral health platforms BrightLife Kids and Soluna, the Never a Bother suicide prevention resources, and the 988 crisis lifeline. Gov. Gavin Newsom vetoed a bill that would have broadly banned companion chatbots for minors.
Read at KQED