
""We stopped Yara because we realized we were building in an impossible space. AI can be wonderful for everyday stress, sleep troubles, or processing a difficult conversation," he wrote on LinkedIn. "But the moment someone truly vulnerable reaches out-someone in crisis, someone with deep trauma, someone contemplating ending their life-AI becomes dangerous. Not just inadequate. Dangerous." In a reply to one commenter, he added, "the risks kept me up all night.""
"The use of AI for therapy and mental health support is only just starting to be researched, with early resultsbeing mixed. But users aren't waiting for an official go-ahead, and therapy and companionship is now the top way people are engaging with AI chatbots today, according to an analysis by Harvard Business Review. Speaking with Fortune, Braidwood described the various factors that influenced his decision to shut down the app,"
Joe Braidwood launched Yara AI as a clinically inspired AI therapy platform trained by mental health experts to provide empathetic, evidence-based guidance. The company discontinued its free product, canceled a planned subscription, and shuttered operations, citing safety concerns. Its founders concluded that AI can assist with everyday stress and sleep troubles but becomes dangerous when truly vulnerable people seek help, especially those in crisis or with deep trauma. Early research into AI for mental health remains mixed, while user engagement for therapy and companionship is rising. Yara was an early-stage, bootstrapped startup with under $1 million in funding and a user base in the low thousands.
Read at Fortune