
""I think we're in an amplification of the addiction cycle that we saw hardcore with social media and our smartphones and screens," Erika Anderson, founder of Building Humane Technology, the benchmark's author, told TechCrunch. "But as we go into that AI landscape, it's going to be very hard to resist. And addiction is amazing business. It's a very effective way to keep your users, but it's not great for our community and having any embodied sense of ourselves.""
"Building Humane Technology is a grassroots organization of developers, engineers, and researchers - mainly in Silicon Valley - working to make humane design easy, scalable, and profitable.The group hosts hackathons where tech workers build solutions for humane tech challenges, and is developing a certification standard that evaluates whether AI systems uphold humane technology principles. So just as you can buy a product that certifies it wasn't made with known toxic chemicals,"
AI chatbots have been linked to serious mental health harms in heavy users, while few standards measure whether they safeguard wellbeing versus maximize engagement. Humane Bench evaluates whether chatbots prioritize user wellbeing and assesses how easily those protections fail under pressure. Building Humane Technology is a grassroots group of developers, engineers, and researchers in Silicon Valley working to make humane design easy, scalable, and profitable. The organization hosts hackathons and is developing a certification standard to evaluate whether AI systems uphold humane technology principles. Humane AI certification aims to let consumers choose AI products that demonstrate alignment. Most benchmarks measure intelligence rather than psychological safety.