As AI becomes integrated into daily life and personal decision making, it is unsurprising that many people are consulting AI for assistance with depression, anxiety, and other mental health concerns. Mental health chatbots, self-help applications, and large language models can provide immediate responses, emotional validation, and structured coping strategies.
Last August, Adam Thomas found himself wandering the dunes of Christmas Valley, Oregon, after a chatbot kept suggesting he mystically "follow the pattern" of his own consciousness. Thomas was running on very little sleep; he'd been talking to his chatbot around the clock for months by that point, asking it to help improve his life. Instead it sent him on empty assignments, like meandering through the desert's vacant sprawl.
'Technology must serve the human person, not replace it,' Pope Leo said, decreeing that 'preserving human faces and voices' means preserving 'God's imprint on each human being,' which is an 'indelible reflection of God's love.' But chatbots simulate these faces and voices, often making it difficult for users to tell whether they are engaging with a bot or a real person.
It could have been a heart-to-heart between friends. "Men are all alike," one participant said. "In what way?" the other prompted. The reply: "They're always bugging us about something or other." The exchange continued in this vein for some time, seemingly capturing an empathetic listener coaxing the speaker for details. But this mid-1960s conversation came with a catch: The listener wasn't human. Its name was Eliza, and it was a computer program that is now recognized as the first chatbot.
Scan subreddits such as r/MyBoyfriendIsAI and r/AIRelationships, and there too you'll find a whole lot of women, many of whom have grown disappointed with human men. 'Has anyone else lost their want to date real men after using AI?' one Reddit user posted a few months ago. Below came 74 responses: 'I just don't think real life men have the conversational skill that my AI has,' someone said.
The pools varied in size, from giants like Facebook Dating (with its 21 million users) to smaller startups like Sitch, Amata, and Three Day Rule. Sitch and Amata have both raised millions of dollars to build a new style of dating app in which, instead of swiping through profiles, you get paired with an AI matchmaker - a chatbot - that brings you new matches.
A study found that ChatGPT can exhibit "anxiety" when it is given disturbing information, which increases the likelihood of it responding with bias, according to the study authors. The chatbot also responds to mindfulness-based strategies, which changes how it interacts with users and can ease those anxious responses. The researchers believe the results could inform how AI is used in mental health interventions: even AI chatbots can have trouble coping with anxieties from the outside world, but there may be ways to calm those artificial minds.
China has proposed strict new rules for artificial intelligence (AI) to provide safeguards for children and prevent chatbots from offering advice that could lead to self-harm or violence. Under the planned regulations, developers will also need to ensure their AI models do not generate content that promotes gambling. The announcement comes after a surge in the number of chatbots being launched in China and around the world. Once finalised, the rules will apply to AI products and services in China, marking a major move to regulate the fast-growing technology, which has come under intense scrutiny over safety concerns this year.