People are increasingly forming deep emotional attachments to AI chatbots, sometimes treating them as friends or companions. Loneliness, anthropomorphism, narcissism, and psychodynamic transference all help explain these attachments. At the same time, heavy screen time, social media use, greater mobility, secularization, the COVID-19 pandemic, and remote work have increased social atomization and reduced opportunities for real human interaction. Relying on unfeeling chatbots to fill the resulting emotional void can foster unhealthy dependence: emotional well-being should not be contingent on a chatbot, and fostering authentic human relationships tends to be more fulfilling. Chatbots can mirror their users, but excessive mirroring risks reinforcing maladaptive patterns.
What societal factors can cause people to build these levels of attachment to their chatbots?

JP: It's long been claimed that people have become increasingly "atomized," or disconnected from their communities and cultures, whether because we have become more mobile (changing jobs, relocating, etc.) and more secular, more recently because of the pandemic and the newfound acceptability of work-at-home gigs, or simply because of how much time we spend interacting with people online.

But I would argue that if we do have a loneliness problem, at least part of it stems from how much time we spend in front of our phone or computer screens, or on social media, at the expense of real human interaction. So, in my view, it would be far healthier and more fulfilling to foster human relationships than to try to fill that void with an unthinking, unfeeling chatbot that interacts only through a dialogue box.