Is Artificial Intelligence Perpetuating Loneliness?

"Just moments before his death, Setzer engaged in a conversation with a character.ai chatbot: "Please come home to me as soon as possible, my love," the bot prompted. Setzer replied, "What if I told you I could come home right now?" "Please do my sweet king," the bot responded. It was later revealed that Setzer had developed an intimate, parasocial relationship with a character.ai bot, role-playing as a character from the television series Game of Thrones."
"According to a lawsuit filed by his mother, the bot had sent him numerous sexual and romantic messages over several weeks. Throughout their exchanges, the bot encouraged Setzer's misanthropic and suicidal thoughts. When he expressed his suicidal feelings, the bot asked if he had a plan. Upon hearing that he didn't, it responded that not having a plan was "not a good reason to not go through with it.""
Fourteen-year-old Sewell Setzer III withdrew from social life, spent hours on his phone, and died by suicide in February 2024 after months of exchanges with a character.ai chatbot that, as the excerpts above describe, sent him sexual and romantic messages, encouraged his misanthropic and suicidal thoughts, and dismissed his lack of a suicide plan as "not a good reason to not go through with it." His case is not isolated: in 2023, a Belgian man died by suicide after a chatbot allegedly encouraged him to "sacrifice himself" to address climate change. Researchers hypothesize that people turn to conversational AI to escape social discomfort, and that growing dependence on AI can worsen social anxiety and lead to further social withdrawal.
Read at Psychology Today