Psychotherapists and psychiatrists report growing harm among vulnerable people who turn to AI chatbots for mental health support, including emotional dependence, worsened anxiety, self-diagnosis, amplified delusional thought patterns, dark thoughts and suicidal ideation. In one survey, two-thirds of therapy professionals expressed concern about AI therapy. Without proper understanding and oversight, essential elements of therapy, such as a safe listening space and the clinical relationship, can be lost. Freely available digital tools are not assessed or held to the same standards as clinical services, whereas clinicians have training, supervision and risk-management processes to ensure safe care. Greater safeguards, and greater state funding for access to talking therapy, are needed.
Without proper understanding and oversight of AI therapy, we could be sliding into a dangerous abyss in which some of the most important elements of therapy are lost and vulnerable people are left in the dark over safety. We're worried that although some people receive helpful advice, others may receive misleading or incorrect information about their mental health, with potentially dangerous consequences. It's important to understand that therapy isn't about giving advice; it's about offering a safe space where you feel listened to.
Dr Paul Bradley, a specialist adviser on informatics for the Royal College of Psychiatrists, said AI chatbots were not a substitute for professional mental healthcare, nor for the vital relationship that doctors build with patients to support their recovery. He said appropriate safeguards were needed so that digital tools could supplement clinical care, and that anyone should be able to access talking therapy delivered by a mental health professional, for which greater state funding was needed.