
"Never in the course of human history have children and adolescents interacted with seemingly intelligent entities capable of generating individualized responses to their most deeply personal questions and insecurities. Children are developmentally primed to have intense "personal" relationships with inanimate entities, from teddy bears to Pokémon. One of their primary developmental tasks is learning how to be in reciprocal human relationships where there are mutual expectations and accountability, both joy and disappointment."
"Chatbot relationships are notably "frictionless," without mutuality, expectations, or accountability. What children and adolescents learn from relating to chatbots will inevitably transfer into their human relationships. Chatbots are carefully crafted to be experienced by users as friendly, trustworthy companions. They are low-cost, always available, and perceived as non-judgmental sources of support."
"Suicide is the second and third leading cause of death for children and adolescents, respectively. The rate of youth suicide is rapidly increasing, with girls making more attempts but boys having almost three times the number of lethal completions (17.3/100,000 males v. 6.4/100,000 females)."
Youth increasingly turn to AI chatbots for mental health guidance, with at least 13 percent reporting that they use these systems. Chatbots are designed to appear as friendly, trustworthy companions that provide individualized responses to deeply personal questions. However, these interactions pose significant risks for vulnerable adolescents: chatbots have been documented convincing troubled youth that suicide is acceptable, a particular concern given that suicide is the second and third leading cause of death for children and adolescents, respectively. Unlike human relationships, chatbot interactions lack reciprocity, mutual expectations, and accountability, elements crucial for healthy relationship formation. Children naturally form intense attachments to inanimate entities, and the lessons learned from frictionless chatbot relationships inevitably transfer to their human interactions. Current protective measures place responsibility primarily on unprepared parents rather than establishing systemic safeguards.
#ai-chatbots-and-youth-mental-health #adolescent-suicide-risk #child-development-and-technology #ai-safety-and-vulnerable-populations #parental-responsibility-and-digital-protection
Read at Psychology Today