
""If I give you direct advice and tell you what to do, I can't take direct responsibility for the outcome. I do not have to live your life or suffer the negative fallout (relational, professional, financial, etc.) of your decisions. So, it's easy for me to sit here and say what I think you should do, but I'm not truly in your shoes and will not have your emotional experience.""
"People use it to craft difficult emails, navigate relationships with family members, and even to help message dating partners for optimal results. A recent study by Collins et al. (2025) indicated that most users experienced more benefit as opposed to risk in using ChatGPT for mental health issues. It is particularly strong in wording messages. It is, however, limited in some critical ways."
Many people use ChatGPT for relationship and mental health help, especially for wording difficult messages and composing emails. A study (Collins et al., 2025) found more users experienced benefits than harms when using ChatGPT for mental health issues. ChatGPT remains inconsistent, lacks a moral framework, and offers muted disclaimers. The system is not accountable for its advice and does not follow HIPAA or robust privacy protections. Human therapists avoid direct advice because they bear no responsibility for clients' lived consequences. Reliance on ChatGPT for therapy-like guidance can therefore create precarious situations.