AI overwhelmingly prefers white and male job candidates in new test of resume-screening bias
Briefly

The study found that three large language models favored resumes with white-associated names 85% of the time, and resumes with female-associated names just 11% of the time.
Kyra Wilson said: "These groups have existing privileges in society that show up in training data, [the] model learns from that training data, and then either reproduces or amplifies the exact same patterns in its own decision-making tasks."
The study analyzed 554 resumes against 571 job descriptions and found significant racial and gender bias, with Black men facing the most discrimination.
The models preferred white men even for roles traditionally dominated by women, highlighting deep systemic bias in AI resume screening.
Read at GeekWire