
"Drawing from anonymized data gathered from the online activity of roughly 3,000 children aged five to 17 whose parents use Aura's parental control tool, as well as additional survey data from Aura and Talker Research, the security firm found that 42 percent of minors turned to AI specifically for companionship, or conversations designed to mimic lifelike social interactions or roleplay scenarios."
"Of that 42 percent of kids turning to chatbots for companionship, 37 percent engaged in conversations that depicted violence, which the researchers defined as interactions involving "themes of physical violence, aggression, harm, or coercion" - that includes sexual or non-sexual coercion, the researchers clarified - as well as "descriptions of fighting, killing, torture, or non-consensual acts." Half of these violent conversations, the research found, included themes of sexual violence."
The analysis, which spanned nearly 90 chatbot services, also found that minors in violent chats wrote over a thousand words per day on average, an indication that violent content drove higher engagement.
Read at Futurism