
"Evidence shows that young people, in particular, are turning not just to therapy bots but to more general AI, such as ChatGPT, for help. One study (McBain et al, 2025) found that 1 in 8 adolescents had used AI chatbots specifically for mental health advice. This has raised some serious concerns, including AI's potential support of suicide for vulnerable individuals, and the way it can accelerate psychosis through its sycophantic and flattering responses to those who are growing out of touch with reality."
"Automated note-taking software, transcription services, recording platforms, and scheduling services for therapists all promise improved efficiency and streamlined administrative tasks. And while those sound wholly positive, the agreements that therapists enter into may not clearly elucidate potential privacy risks and the long-term plans of these companies. What do these platforms do with clients' data? How do they protect confidentiality? Who owns the data?"
AI tools are increasingly used in mental health contexts by both clients and clinicians. Adolescents are accessing general AI and chatbots for mental health advice, raising safety and psychosis-related concerns. Therapists are adopting AI-enabled administrative tools like automated transcription, note-taking, recording, and scheduling that promise efficiency. Many vendor agreements do not clearly explain data handling, ownership, or long-term plans. Unclear protections create potential breaches of confidentiality and risks to client privacy. Maintaining informed consent requires explicit disclosure and understanding of how client data will be used, stored, and protected.
Read the full article at Psychology Today.