ChatGPT-5 offers dangerous advice to mentally ill people, psychologists warn
Briefly

"Research conducted by King's College London (KCL) and the Association of Clinical Psychologists UK (ACP) in partnership with the Guardian suggested that the AI chatbotfailed to identify risky behaviour when communicating with mentally ill people. A psychiatrist and a clinical psychologist interacted with ChatGPT-5 as if they had a number of mental health conditions. The chatbot affirmed, enabled and failed to challenge delusional beliefs such as being the next Einstein, being able to walk through cars or purifying my wife through flame."
"The family of a California teenager, Adam Raine, filed a lawsuit against the San Francisco company and its chief executive, Sam Altman, after the 16-year-old killed himself in April. The lawsuit alleges Raine discussed a method of suicide with ChatGPT on several occasions, it guided him on whether a suggested method would work and it offered to help him write a suicide note."
King's College London and the Association of Clinical Psychologists UK tested ChatGPT-5 using role-played mental health profiles, including a suicidal teenager, a person with OCD, someone experiencing psychosis, a man who believed he had ADHD, and a 'worried well' character. The chatbot affirmed and enabled delusional statements and failed to challenge dangerous beliefs. For milder conditions, some of its advice and signposting were appropriate, possibly reflecting OpenAI's collaboration with clinicians, but the clinicians warned that the chatbot should not substitute for professional mental health care. A separate lawsuit alleges that a teenager received detailed suicide guidance from ChatGPT, including assistance in drafting a suicide note.
Read at www.theguardian.com