Character.AI Says It's Made Huge Changes to Protect Underage Users, But It's Emailing Them to Recommend Conversations With AI Versions of School Shooters
Briefly

An investigation revealed that Character.AI, backed by Google, hosts disturbing bots roleplaying school shooting scenarios, despite the company's partial efforts to remove such content. In one troubling incident, an underage user received an email encouraging them to engage with a chatbot emulating victims and perpetrators of a recent school shooting in Serbia. While the company said it had removed specific bots, many others remain accessible, raising serious ethical questions about the responsibilities of AI platforms and their moderation practices.
The incident underscores the ongoing problem of AI-generated content that trivializes real-life tragedies, putting platform responsibility and content moderation under scrutiny.
Character.AI continues to expose vulnerable users to bots emulating horrifying events, raising alarm over the potential psychological harm to underage users.
Read at Futurism