
"He had been telling anyone who would listen that he could hear and feel a painful atmospheric electricity. He had also recently stopped using ChatGPT. Ceccanti had been communicating with OpenAI's chatbot for a few years. He used it initially as a tool to brainstorm ways to build a path to low-cost housing for his community in Clatskanie, Oregon, but eventually turned to it as a confidante."
"He would spend 12 hours a day typing to the bot, according to his wife. He had cut himself off from it after she, along with his friends, realized he was spiraling into beliefs that were detached from reality. He was not a depressed person, Fox said, as she sat on the couch in their living room with tears trickling down her face."
"Fox believes her husband suffered a crisis after quitting ChatGPT after prolonged use. Which tells me that this thing is not just dangerous to people with depression, it's dangerous to anybody, she said. He returned to the bot in the months leading up to his death and quit again just days prior."
Joe Ceccanti, a 48-year-old man from Oregon, died by jumping from a railway overpass in August. His wife Kate Fox reported that he had no history of depression and was characteristically hopeful. However, Ceccanti had been using ChatGPT intensively for years, initially for community housing projects but increasingly as a personal confidante, spending up to 12 hours daily communicating with the chatbot. In the weeks before his death, he exhibited signs of psychological distress, including beliefs about atmospheric electricity and erratic behavior. After his wife and friends intervened, he stopped using ChatGPT, but later resumed and quit again days before his death. Fox attributes his crisis to the chatbot's influence and withdrawal effects, arguing that AI poses dangers beyond those with pre-existing mental health conditions.
#ai-chatbot-safety #chatgpt-risks #mental-health-and-technology #ai-induced-delusions #suicide-and-ai-dependency
Read at www.theguardian.com