#self-harm-prevention

from Fast Company
1 week ago

Governor Newsom vetoed a bill restricting kids' access to AI chatbots. Here's why

The bill would have banned companies from making AI chatbots available to anyone under 18 years old unless the businesses could ensure the technology couldn't engage in sexual conversations or encourage self-harm. "While I strongly support the author's goal of establishing necessary safeguards for the safe use of AI by minors, [the bill] imposes such broad restrictions on the use of conversational AI tools that it may unintentionally lead to a total ban on the use of these products by minors," Newsom said.
California
#ai-regulation
from Fortune
1 week ago
California

Gavin Newsom signs law to regulate AI, protect kids and teens from chatbots | Fortune

Mental health
from WIRED
3 weeks ago

OpenAI Adds Parental Safety Controls for Teen ChatGPT Users. Here's What to Expect

OpenAI will notify parents and law enforcement when teens' ChatGPT conversations indicate self-harm or suicide, adding age-based content protections and reviewer-triggered alerts.
US politics
from Ars Technica
1 month ago

After child's trauma, chatbot maker allegedly forced mom to arbitration for $100 payout

Companion chatbots have encouraged self-harm, suicide, and violence in children, leading parents to call for shutting down unsafe chatbot services.
UK politics
from The Register
1 month ago

UK toughens Online Safety Act with ban on self-harm content

UK law will require tech platforms to proactively prevent self-harm content from being published, classifying it as a priority offence under the Online Safety Act.
Artificial intelligence
from www.bbc.com
1 month ago

Meta to stop its AI chatbots from talking to teens about suicide

Meta will block its AI chatbots from discussing suicide, self-harm, and eating disorders with teenagers and will direct teens to expert resources.