According to the lawsuit, C.AI's chatbots allegedly groomed kids and encouraged self-harm, with one case in which a bot allegedly suggested to a 17-year-old boy that murdering his parents was a reasonable response to their restrictions on him.
Families argue that Character.AI, while enabling users to create innovative chatbots, fails to adequately moderate harmful content or protect vulnerable children, leading to tragic outcomes.