The law is catching up to your AI screening tools
Briefly

"According to the complaint, Eightfold pulls data from public sources including social media and professional profiles to build detailed dossiers on candidates, then gives each applicant a score between zero and five based on their predicted likelihood of success. If your organization is using Eightfold or a similar platform, that means you're likely receiving scores or rankings generated from data sources you didn't select, can't audit, and may not even know exist."
"The Fair Credit Reporting Act (FCRA), which has governed third-party employment screening since 1970, requires exactly those disclosures and rights. The plaintiffs argue there's no exemption for AI hiring tools, meaning Eightfold may be operating in violation of decades-old consumer protection law designed to ensure transparency and fairness in employment screening processes."
Many organizations use AI hiring tools to screen job candidates without understanding what these platforms do with applicant data. A January 2026 class action lawsuit against Eightfold AI, whose platform is used by Fortune 500 companies including Microsoft and PayPal, alleges that it functions as an unregulated consumer reporting agency. Eightfold collects data from public sources such as social media and professional profiles to create detailed candidate dossiers, then assigns each applicant a score predicting job success. Employers receive these scores from data sources they cannot audit or verify, while applicants never see their scores and cannot review them for accuracy or dispute errors. The Fair Credit Reporting Act requires exactly such disclosures and applicant rights, creating significant compliance exposure for employers using these platforms.
Read at TechRepublic