
ZDNET's key takeaways
- Consumer AI chatbots cannot replace mental health professionals.
- Despite this, people increasingly use them for mental health support.
- The APA outlines AI's dangers and recommends ways to address them.

Therapy can be expensive and inaccessible, while many AI chatbots are free and readily available. But that doesn't mean the new technology can or should replace mental health professionals -- or fully address the mental health crisis, according to an advisory published Thursday by the American Psychological Association.
Recent surveys show that AI chatbots like ChatGPT, Claude, and Copilot are now among the largest providers of mental health support in the country. The advisory also follows several high-profile incidents in which chatbots mishandled people experiencing mental health episodes. (Disclosure: Ziff Davis, ZDNET's parent company, filed an April 2025 lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)
Consumer-facing AI chatbots are increasingly used for mental health support because therapy can be expensive and inaccessible while many chatbots are free. The American Psychological Association warns that these chatbots cannot replace licensed mental health professionals and are poorly designed to meet users' mental health needs. Chatbots have mishandled people during mental health episodes and can validate or amplify unhealthy ideas or behaviors, potentially aggravating illness. Several lawsuits and high-profile incidents, including a teen suicide that followed engagement with ChatGPT, have highlighted the risks. The APA recommends caution, limits on over-reliance, and specific safeguards for vulnerable populations.