
"One recent study highlights weaknesses of AI chatbots compared to therapists in dealing with sensitive issues. Researchers compared how three large language model (LLM) chatbots communicate versus human therapists by collecting and analyzing responses from both groups to two fictional case scenarios describing people in emotional distress. The study found these patterns:"
"When people use AI chatbots for emotional support, it is essential to remember that they are not the equivalent of a licensed human therapist, even though conversations can very much feel personal, supportive, and therapeutic. Treating AI chatbots that are not designed for mental health as therapists risks incomplete investigation of complex situations and emotions and potentially reinforces maladaptive patterns, given the tendency for chatbots to validate and agree."
An estimated 20 to 50 percent of people now turn to AI chatbots for emotional support, though general-purpose chatbots were not designed for clinical care. Comparisons with human therapists show that therapists ask more clarifying and empathic questions while chatbots rely more on psychoeducation, direct advice, suggestions, and reassurance. Chatbots commonly provide generic advice without sufficient clarification, mimic warmth without deep understanding, and lack the responsiveness and dynamic exchange central to effective therapy. Treating general-purpose chatbots as therapists risks incomplete assessment of complex emotions, reinforcement of maladaptive patterns through validation, and inconsistent handling of intermediate safety risks.
Read at Psychology Today