AI's listening gap is fueling bias in jobs, schools and health care
Briefly

How it works: AI-powered speech recognition systems convert spoken words into text through automatic speech recognition (ASR), which uses acoustic models trained on millions of audio samples. Some companies use AI to transcribe and analyze interview responses, scoring candidates for jobs on clarity, keywords or sentiment. Schools use voice AI for oral reading tests, class captions and language learning. "Ambient" AI tools listen during doctor visits and convert conversations into medical notes. U.S. courtrooms are also using similar systems to transcribe proceedings.
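To make the scoring step concrete: a minimal, hypothetical sketch of keyword-based scoring on an ASR transcript. The keyword list, the function name, and the scoring rule are all illustrative assumptions, not any vendor's actual method; real platforms combine many more signals (clarity, sentiment, pacing), but the core pattern is matching transcript words against target terms.

```python
# Hypothetical sketch only: not any real hiring platform's algorithm.
# Assumed keyword list for illustration.
KEYWORDS = {"led", "team", "delivered", "stakeholders"}

def keyword_score(transcript: str) -> float:
    """Return the fraction of target keywords found in the transcript (0.0-1.0)."""
    words = set(transcript.lower().split())
    return len(KEYWORDS & words) / len(KEYWORDS)

# A candidate's answer, as transcribed by ASR:
transcript = "i led a small team and delivered the project on time"
print(keyword_score(transcript))  # 3 of 4 keywords present -> 0.75
```

The fragility is visible even in this toy: if the ASR mishears "led" as "let", the score drops with no change in what the candidate actually said.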
Sarah Myers West, co-executive director of the AI Now Institute, tells Axios that mistranscription can lead to misdiagnoses or to false information in criminal cases. "We're already seeing AI replicate patterns of inequality," she said. "If these systems decide who gets a job interview or access to care, they risk amplifying those same divides." West said these AI systems still mishear people because they are being deployed without proper testing or oversight.
Zoom out: Allison Koenecke, an assistant professor of Information Science at Cornell Tech, tells Axios there's insufficient awareness of how AI speech models are being applied in "high-stakes domains" such as health care and criminal justice. "At face value, it seems fair because you're using the same speech model for everyone. But if that model is inherently biased, it leads to different outcomes for different people."
AI-powered speech recognition converts spoken words into text through ASR, using acoustic models trained on millions of audio samples. Applications include automated transcription and analysis for job interviews, oral reading tests, classroom captions, language learning, ambient medical note-taking, and courtroom transcription. Studies show these systems misinterpret speech from some Black speakers and from people who do not use standard English, producing errors that can lead to misdiagnoses or false information in criminal cases. Deploying these models without adequate testing or oversight allows their biases to produce disparate outcomes. Automated interview-scoring platforms used by major companies can influence hiring decisions based on transcriptions and algorithmic scores.
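The disparities described above are typically measured with word error rate (WER): the edit distance between a reference transcript and the ASR output, divided by the reference length. A minimal sketch, with made-up example sentences; the function name and sample transcripts are illustrative assumptions, not data from any cited study.

```python
# Word error rate via Levenshtein distance over words (standard ASR metric).
def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = minimum edits to turn ref[:i] into hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # substitution or match
    return d[len(ref)][len(hyp)] / len(ref)

reference = "she said she did not want to go"
hyp_a = "she said she did not want to go"   # perfect transcription
hyp_b = "she set she din want a go"         # hypothetical mistranscription
print(word_error_rate(reference, hyp_a))  # 0.0
print(word_error_rate(reference, hyp_b))  # 0.5
```

Comparing average WER across speaker groups is how researchers quantify the "different outcomes for different people" Koenecke describes: the same model, run on everyone, can still err twice as often for one group.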
Read at Axios