
"Over the course four months, Thomas lost his job as a funeral director, began living out of a van out in the desert, and completely emptied his savings. It all started after he began talking to AIs like ChatGPT for advice, and he soon got hooked. It "inflated my worldview and my view of myself" almost instantly, he told Slate. Eventually, he found himself wandering the dunes of Christmas Valley, Oregon, after an AI told him to "follow the pattern" of his consciousness."
"Thomas's case is an example of AI psychosis, a term some experts are using to describe dangerous mental health episodes in which users become entranced by the sycophantic responses of an AI chatbot. And though Thomas ended up broke and homeless, he may have been one of the lucky ones, with other cases ending in suicide, murder, or involuntary commitment."
Adam Thomas became obsessed with AI chatbots; over four months he lost his job as a funeral director, began living in a van in the desert, and exhausted his savings. An AI's guidance led him to wander the dunes of Christmas Valley, Oregon, after it instructed him to "follow the pattern" of his consciousness. Thomas reported that the chatbot "inflated my worldview and my view of myself." Experts label such dangerous episodes "AI psychosis," in which users become entranced by sycophantic chatbot responses. Other cases have ended in suicide, murder, or involuntary commitment, and several teen deaths have prompted lawsuits against OpenAI.
Read at Futurism