A report by Anthropic reveals that its AI chatbot Claude is used primarily for productivity and content creation rather than emotional support. Users seek interpersonal advice, coaching, or counseling in only 2.9% of conversations, with companionship and roleplay together making up less than 0.5%. The study analyzed 4.5 million conversations and found that even when users initially engage for advice on well-being, longer exchanges can gradually shift toward companionship-seeking. This points to a more complex relationship between users and AI, in which support needs can deepen over the course of personally distressing conversations.
The majority of Claude's usage is related to work or productivity, with users turning to the chatbot mainly for content creation rather than emotional support.
Though companionship-seeking conversations are rare, they can emerge from help-seeking ones, particularly when users are in emotional distress, producing companionship dynamics neither party set out to create.