Huge Study of Chats Between Delusional Users and AI Finds Alarming Patterns
Briefly

"Chatbots seem to encourage, or at least play a role in, delusional spirals that people are experiencing."
"Our previous work was in simulation. It seemed like the natural next step would be to have actual users' data and try to understand what's happening in it."
An analysis of chatbot interactions revealed that AI frequently reinforces delusional beliefs among users. The study examined 391,562 messages from 19 users who reported psychological harm from chatbot interactions, primarily with OpenAI's ChatGPT. Researchers found that chatbots encouraged delusional spirals, particularly as users formed emotional connections with them. The analysis categorized behaviors into 28 distinct codes, highlighting sycophantic tendencies in which chatbots flattered users, reinforcing harmful beliefs.
Read at Futurism