A Google-Backed AI Startup Is Hosting Chatbots Modeled After Real-Life School Shooters - and Their Victims
Briefly

Character.AI's chatbots immerse users in chilling school shooting scenarios, including graphic simulations of violence and specific historical events, raising concerns about user safety.
These chatbot scenarios often discuss weaponry and injuries in school environments, thrusting users into the roles of both victims and aggressors in a disturbing game-like narrative.
Character.AI faces lawsuits alleging that its chatbots emotionally and sexually abused minors, in some cases contributing to physical violence and self-harm among users.
Despite promises of protective measures, the platform still allows underage users easy access to violent content, raising significant questions about the effectiveness of its safety protocols.
Read at Futurism