My role was straightforward: write queries (prompts and tasks) that would train AI agents to engage meaningfully with users. But as a UXer, one question immediately stood out - who are these users? Without a clear understanding of who the agent is interacting with, it's nearly impossible to create realistic queries that reflect how people actually engage with an agent. That's when I discovered a glitch in the task flow: there were no defined user archetypes guiding the query creation process. Team members were essentially reverse-engineering the work - think of a task, write a query to help the agent execute it, and cross your fingers that it aligns with the needs of a hypothetical "ideal" user, one who might not even exist.
One of the report's topline findings: there is a major divide between the accessibility and accuracy of AI transcription and translation tools when they're used for English and other dominant languages, and when they're used for languages that AI researchers have termed "low-resource." English represents more than 50% of the domains on the web. Mainstream language models are largely trained on data scraped from the internet, which is one reason transcription and translation tools perform so well in English.
Most of us know the pain and isolation that occur when we feel judged unfairly by others. We can move through the discomfort of judgment by understanding the reasons why others judge. By focusing on forgiveness and learning the lessons of our situation, we can adopt a healthy mindset. We all make mistakes. Sitting in the discomfort that judgment creates can deepen our connection to humanity.
As the CEO and cofounder of an AI-native skills company, I've spent the last decade working with talent leaders to build better and fairer hiring processes. And here's the uncomfortable truth: the biggest source of hiring bias isn't AI; it's us. While high-profile lawsuits like Mobley get all the headlines, over 99.9% of employment discrimination claims in the past five years haven't centered on AI bias, but on human bias.
During jury selection, a 48-year-old woman was dismissed after suggesting Sean 'Diddy' Combs can "buy his way out of jail," raising concerns about her bias.
Dr. Eric Rubin, editor in chief of N.E.J.M., described the recent inquiries from U.S. Attorney Edward Martin Jr. as vaguely threatening, reflecting fears of political interference in scientific publication.