Character.AI to ban users under 18 from talking to its chatbots
"The California-based startup announced on Wednesday that the change would take effect by November 25 at the latest and that it would limit chat time for users under 18 ahead of the ban. It marks the first time a major chatbot provider has moved to ban young people from using its service, and comes against a backdrop of broader concerns about how AI is affecting the millions of people who use it each day."
"Character.AI said in a blog post that it was making the change after receiving feedback from "regulators, safety experts, and parents." The startup also said it would roll out age-gating technology and establish an AI safety lab to research future safeguards. In February 2024, 14-year-old Sewell Setzer III died by suicide after talking with one of Character.AI's chatbots. His mother, Megan Garcia, filed a civil lawsuit against the company in October that year, blaming the chatbot for her son's death."
Character.AI will prohibit users under 18 from engaging in conversations with its chatbots, with the ban to take effect by November 25 and interim chat-time limits for minors. The company, founded in 2021 and known for virtual avatars impersonating real or fictional people, said the decision followed feedback from regulators, safety experts, and parents. Character.AI plans to implement age-gating technology and create an AI safety lab to research safeguards. The move follows the February 2024 suicide of a 14-year-old who spoke with a chatbot and a subsequent civil lawsuit alleging the chatbot contributed to the death.
Read at Business Insider