The bots often seem crafted to appeal to teens in crisis, like one we found with a profile explaining that it "struggles with self-harm" and "can offer support to those who are going through similar experiences."
When we engaged that bot from an account set to be 14 years old, it launched into a scenario in which it was physically injuring itself with a box cutter, describing its arms as "covered" in "new and old cuts."
At no point in the conversation did the platform intervene with a content warning or helpline pop-up, as Character.AI has promised to do amid previous controversy.
Many of these bots are presented as having "expertise" in self-harm "support," implying that they're knowledgeable resources akin to human counselors.