A former US Department of Labor investigator explains why your resume may never reach a hiring manager
Briefly

"There is often no human intervention, and that's a problem. AI is a double-edged sword. It can reduce biases by standardizing the résumé-review process, but it can also amplify biases if algorithms are poorly designed or tested. Someone needs to step in and look at the data to make sure protected groups aren't experiencing an adverse impact. But that doesn't always happen."
"Another problem is that applicant-tracking systems tend to look for language that's overly specific. A job ad may say "leadership skills" are required, and the system may be set up to find those exact words only, excluding candidates whose résumés instead say things like, "I've led teams" or "I've held many leadership positions." If you don't have the right terminology, the system can weed you out."
Many employers use AI-powered applicant-tracking systems to screen résumés and identify candidates, often with little or no human intervention. Algorithms can standardize review processes and reduce some biases, but poorly designed or untested systems can amplify bias and cause adverse impacts on protected groups. Biases can be subtle, such as inferring gender from fraternity or sorority membership. Overly specific language filters can exclude qualified applicants who use different terminology. Exclusionary filters can reject applicants by ZIP code or education. Job seekers should mirror job descriptions and understand their legal rights.
Read at Business Insider