Why Job Seekers Are Suing Over Secret AI Hiring Scores

A new class-action lawsuit challenges AI hiring scores, arguing they should be treated like consumer reports under the FCRA. The case targets Eightfold and pushes for transparency, notice, consent, and dispute rights.

Applying for jobs is stressful enough — but what if an unseen algorithm is quietly deciding your fate? A new class-action lawsuit aims to force transparency around AI tools that generate candidate scores, arguing these ratings should be treated like consumer reports.

Opaque scoring systems under legal scrutiny

The suit, filed in California state court, was brought by two women working in STEM who say they were screened out of roles despite being qualified. At issue is a numerical 'match score' produced by Eightfold, an AI hiring platform. That score aggregates data from job postings, employer requirements, resumes, and sometimes public profiles to give each applicant a 0–5 rating indicating how well they fit a role.
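
The complaint describes the output only at that high level, and Eightfold's actual model is not public. As a purely hypothetical illustration of what "aggregating" job requirements and resume data into a 0–5 rating could look like, here is a minimal sketch in Python; every field, weight, and heuristic below is invented for clarity and should not be read as the company's method.

```python
# Purely illustrative: a toy 0-5 "match score" built from weighted overlap
# between a posting's requirements and an applicant's resume. All names,
# weights, and heuristics here are hypothetical stand-ins, not Eightfold's model.
from dataclasses import dataclass, field


@dataclass
class JobPosting:
    required_skills: set[str]
    preferred_skills: set[str] = field(default_factory=set)
    min_years_experience: int = 0


@dataclass
class Applicant:
    skills: set[str]
    years_experience: int = 0


def match_score(job: JobPosting, applicant: Applicant) -> float:
    """Return a 0-5 rating from simple, transparent heuristics."""
    # Fraction of required and preferred skills the applicant covers.
    req = (len(job.required_skills & applicant.skills) / len(job.required_skills)
           if job.required_skills else 1.0)
    pref = (len(job.preferred_skills & applicant.skills) / len(job.preferred_skills)
            if job.preferred_skills else 1.0)
    # Experience ratio, capped at 1.0 so extra years cannot dominate the score.
    exp = (min(applicant.years_experience / job.min_years_experience, 1.0)
           if job.min_years_experience else 1.0)
    # Weighted blend, scaled to the 0-5 range described in the suit.
    return round(5 * (0.6 * req + 0.2 * pref + 0.2 * exp), 2)


job = JobPosting({"python", "sql", "statistics"}, {"aws"}, min_years_experience=3)
candidate = Applicant({"python", "statistics", "aws"}, years_experience=5)
print(match_score(job, candidate))  # 4.0
```

Even a toy version like this makes the plaintiffs' point visible: small choices about weights and required skills silently decide who surfaces to a recruiter and who never does.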

Plaintiffs argue the process functions like a consumer report covered by the Fair Credit Reporting Act (FCRA). If courts agree, employers and AI vendors could be required to notify applicants, obtain consent, and provide a way to dispute scores — the same protections people enjoy for credit and background checks.

Why this matters to millions of applicants

AI is already embedded in hiring. The World Economic Forum estimates roughly 88% of companies use some form of AI for initial candidate screening. That trend raises a crucial question: are qualified candidates being filtered out by opaque algorithms they never see and cannot contest?

Erin Kistler, one of the plaintiffs, said she applied to hundreds of positions but felt like an 'unseen force' was keeping her from being fairly considered. The case argues those unseen forces should be subject to the consumer-protection rules Congress put in place decades ago.

What the lawsuit seeks — and Eightfold's response

The plaintiffs want a court order forcing Eightfold to follow state and federal consumer reporting laws, plus financial damages for workers allegedly harmed by automated assessments. Jenny R. Yang, a lawyer on the case and former chair of the U.S. Equal Employment Opportunity Commission, framed it bluntly: automated assessments are denying workers opportunities they never had a chance to review or correct.

Eightfold pushed back in a statement, saying the company relies on data candidates share or that customers provide, and that it does not 'scrape social media.' The company added it is committed to responsible AI, transparency, and compliance with data and employment law, and called the allegations without merit.

Key legal and practical implications

  • If courts treat AI match scores as consumer reports, vendors may need to implement notices, consent flows, and dispute processes (a rough sketch of that record-keeping follows this list).
  • Employers using third-party AI tools could face new compliance duties and potential liability for how scores are calculated and applied.
  • The case could prompt greater industry transparency around data sources, model features, and bias audits.
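
To make the first point concrete, the sketch below shows, in purely hypothetical terms, the kind of notice, consent, and dispute record-keeping an FCRA-style regime might demand of a scoring vendor. The field names and structure are invented for illustration and reflect neither Eightfold's systems nor anything a court has yet required.

```python
# Purely illustrative: per-applicant record-keeping for notice, consent,
# and disputes. Field names and structure are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ScoreDisclosure:
    applicant_id: str
    score: float                      # the 0-5 rating shown to the employer
    data_sources: list[str]           # e.g. ["resume", "job posting"]
    notice_sent_at: datetime | None = None
    consent_given: bool = False
    disputes: list[str] = field(default_factory=list)

    def send_notice(self) -> None:
        """Record that the applicant was told a score was generated."""
        self.notice_sent_at = datetime.now(timezone.utc)

    def file_dispute(self, reason: str) -> None:
        """Log an applicant challenge so it can be reviewed and corrected."""
        self.disputes.append(reason)


record = ScoreDisclosure("applicant-123", score=2.4, data_sources=["resume"])
record.send_notice()
record.file_dispute("Score omits five years of relevant lab experience.")
```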

Imagine applying for your dream job only to be silently ranked low by a black-box model. Without a way to see or challenge that ranking, job seekers are left guessing why they never get interviews. That uncertainty is the fuel behind this legal challenge.

What to watch next

The lawsuit is likely to spark broader debate about regulation of AI in hiring. Policymakers, privacy advocates, and labor groups have been pushing for clearer rules on automated decision-making, and a court finding that the FCRA applies to match scores would be a major development. For employers and vendors, the case underscores the practical value of transparency: clear notices, explainable scoring, and avenues for applicant recourse could reduce legal risk and build trust.

For now, the case puts a spotlight on one central tension of modern recruiting: convenience and scale versus fairness and accountability. As AI continues to reshape hiring, more workers are asking an old-fashioned but urgent question — what the hell is going on with my application?
