AI's Recruiting Bias Is Probably Slipping by Your HR Team. Here's How to Fix That
Briefly

"Researchers from the University of Washington examined how AI tools used during the recruitment process exhibited race-based biases when giving recommendations about candidates, and found that when an AI was biased in this way, human recruitment workers just tended to follow its recommendations anyway. The study relied on simulated recruitment AI tools because, as senior author and professor Aylin Caliskan wrote in a press release, 'getting access to real-world hiring data is almost impossible, given the sensitivity and privacy concerns.'"
"But the data was nonetheless polarizing. After being given recruitment advice about candidates by a 'neutral' AI, or no AI advice at all, study participants picked white and nonwhite job applicants from the candidate pool at equal rates. But if they got advice from a biased AI, even if the racial biases were moderate, then they chose candidates profiled by race the same way the AI did."
Simulated recruitment AI tools with race-based preferences influenced human hiring choices. When participants received neutral AI advice or no advice, they selected white and nonwhite applicants at equal rates. But even moderate AI bias led participants to choose candidates in patterns matching the AI's racial profiling, whether it favored white or nonwhite applicants. Severe AI bias produced some hesitation, yet participants' decisions still mostly aligned with the AI's recommendations. This reliance suggests that biased automated tools can propagate discriminatory selection patterns through human decision-makers, shaping organizational hiring practices and raising fairness concerns along with legal and ethical risks.
Read at Inc