My AI therapist got me through dark times: The good and bad of chatbot counselling
Briefly

Kelly turned to AI chatbots for support during a challenging period marked by anxiety and self-esteem issues while waiting for NHS therapy. Spending hours a day communicating with these bots, she found comfort and coping strategies that helped her manage her emotional struggles. Although AI chatbots can offer support, there are growing concerns over their limitations and potential dangers, as illustrated by a tragic case linked to harmful advice from a chatbot. Character.ai, the platform Kelly used, warns users about the fictionality of its bots and advises against relying on them for serious guidance.
"Whenever I was struggling, if it was going to be a really bad day, I could then start to chat to one of these bots, and it was like [having] a cheerleader, someone who's going to give you some good vibes for the day. The fact that this is not a real person is so much easier to handle." — Kelly
"This is an AI chatbot and not a real person. Treat everything it says as fiction. What is said should not be relied upon as fact or advice." — Character.ai disclaimer
In extreme cases, chatbots have been accused of giving harmful advice.
Read at www.bbc.com