AI Hiring Lawsuits: What Employers Need to Know

AI hiring lawsuits are no longer a distant concern for HR departments. They are landing in federal courts right now, and employers, not just the software vendors, are being held liable. If your organization uses automated tools to screen resumes or run background checks, your legal exposure may already exist.

Many companies adopted AI hiring tools for good reasons. They promised faster decisions, fewer bottlenecks, and reduced bias. However, courts are revealing a very different picture.

At Evolution Consulting, our criminal background checks are conducted exclusively by state-licensed human investigators. This page explains why that decision matters, and why the lawsuits below should concern every employer using AI screening tools.

The Workday Case That Redefined AI Screening Liability

In February 2023, Derek Mobley filed a federal lawsuit against Workday, Inc. He alleged that Workday’s AI screening tool rejected him from more than 100 jobs. His claim was that the system discriminated based on his race, age, and disability.

Workday argued that it was simply a software vendor. The company said it did not make hiring decisions directly, so it could not be liable for discrimination. The court disagreed.

The court ruled that AI tool vendors can be held directly liable as “agents” of employers under federal anti-discrimination law. Automating a decision does not remove legal responsibility for the outcome.

In July 2024, the court denied Workday’s motion to dismiss. Then, in May 2025, the case was certified as a nationwide collective action under the Age Discrimination in Employment Act. The potential class includes every applicant over age 40 screened through Workday’s platform since September 2020.

That could mean millions of potential claimants. If your organization uses Workday’s screening features, your company name may appear on notices sent to those claimants. That is not a hypothetical outcome; it follows directly from the court’s order.

ADP Background Check Errors: When AI Gets It Wrong

Workday’s case centers on discriminatory ranking. A separate and more immediate risk involves automated background checks that produce outright errors. ADP Screening and Selection Services has faced several lawsuits in recent years for exactly this problem.

A False Murder Conviction on a Background Report

In August 2023, a job candidate received a conditional employment offer. The employer ran a background check through ADP’s screening division. The report flagged the applicant’s name as an alias for a person convicted of murder.

It was the wrong person entirely. The convicted individual had a different Social Security number, a different age, and a completely different address history. A basic cross-reference of court records would have caught the error immediately.

ADP’s automated system failed to verify the match. As a result, the job offer was rescinded before any human reviewed the discrepancy. ADP later settled the case — known as Mott v. ADP Screening and Selection Services, Inc. — for an undisclosed amount.

Under the Fair Credit Reporting Act (FCRA), consumer reporting agencies must use reasonable procedures to ensure the maximum possible accuracy of reported information. Matching names without verifying other identifying data is a compliance failure — and a lawsuit waiting to happen.

More Mistaken Identity Cases From ADP

In a separate 2023 case, ADP incorrectly reported a job candidate as a convicted drug dealer. Again, it was a case of mistaken identity that no human investigator reviewed. The case was also settled.

Then, in September 2024, a Texas candidate filed suit after an ADP report listed a first-degree felony conviction under a different name. He had no such conviction. A different background check provider found no record of the felony at all.

He disputed the report with ADP. ADP eventually corrected the record. By then, he had already lost the job offer.

A 2024 study published in Criminology found that background checks for more than half of a sample group contained at least one false positive. These errors are not rare exceptions. They are predictable failures of systems built for speed rather than accuracy.

Our background check process at Evolution Consulting is designed to prevent exactly these failures. Every criminal record match is verified by a licensed investigator before any report reaches an employer.

The Name Bias Problem in AI Screening Tools

The ADP cases involve clear factual errors. A more subtle problem runs through AI resume screening: embedded racial bias. This type of bias is harder to detect and harder to challenge in court.

In 2024, researchers at the University of Washington tested three widely used AI resume screening models. They submitted identical resumes with only the applicant’s name changed to signal different racial identities. The results were troubling.

The AI systems preferred resumes with white-associated names in 85.1% of cases. Black male candidates were disadvantaged in up to 100% of direct comparisons with white male candidates. Female-associated names were favored only 11.1% of the time.

These models were not programmed to discriminate. They learned discriminatory patterns from historical hiring data that reflected decades of human bias. The AI did not eliminate that prejudice — it automated and scaled it.

An October 2024 survey found that roughly 7 in 10 companies allow AI tools to reject candidates without any human review. The EEOC and federal courts have been clear: employers remain liable for discriminatory outcomes, regardless of whether a person or an algorithm made the decision.

New Laws Are Tightening the Rules on AI in Hiring

Courts are not the only pressure on employers. State legislatures are moving quickly, and the regulatory direction is clear.

  • New York City Local Law 144 requires annual independent bias audits for any automated hiring tool. Employers must also notify applicants when these tools are used.
  • Illinois requires employers to notify applicants when AI is used in hiring decisions and prohibits discriminatory AI outcomes.
  • California’s Civil Rights Council regulations (effective October 1, 2025) require meaningful human oversight at every automated decision point. Employers must also conduct proactive bias testing and retain records for four years.
  • Colorado’s AI Act (effective June 30, 2026) classifies employment decisions as high-risk AI applications. Employers must complete annual impact assessments and disclose AI use to applicants.
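
Bias audits of the kind Local Law 144 requires center on a simple metric: each group’s selection rate divided by the highest group’s selection rate (the “impact ratio”). A minimal sketch of that calculation follows; the group labels and outcome data are entirely hypothetical, and a real audit would be performed by an independent auditor on actual screening results.

```python
from collections import Counter

def impact_ratios(outcomes):
    """Compute each group's selection rate and its impact ratio
    (group rate / highest group rate), the core metric reported
    in NYC Local Law 144 bias audits."""
    selected, total = Counter(), Counter()
    for group, was_selected in outcomes:
        total[group] += 1
        if was_selected:
            selected[group] += 1
    rates = {g: selected[g] / total[g] for g in total}
    top = max(rates.values())
    return {g: (rates[g], rates[g] / top) for g in rates}

# Hypothetical screening outcomes: (group, advanced past screen?)
data = (
    [("A", True)] * 40 + [("A", False)] * 60   # Group A: 40% selected
    + [("B", True)] * 24 + [("B", False)] * 76  # Group B: 24% selected
)
result = impact_ratios(data)
# Group B's impact ratio here is 0.24 / 0.40 = 0.6, below the
# four-fifths (0.8) benchmark regulators commonly use as a red flag.
```

A ratio below roughly 0.8 does not prove discrimination on its own, but it is exactly the kind of disparity an independent audit is designed to surface before a court does.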

The EU AI Act places hiring in the same high-risk category. Together, these regulations signal an unmistakable shift. Unaudited, unsupervised AI in hiring is on its way out.

Employers in regulated industries — including transportation and healthcare — face additional compliance layers on top of these state requirements.

Why AI Hiring Lawsuits Keep Coming: The Root Problem

AI tools are not inherently unreliable. The problem is how they are deployed — specifically, without adequate human oversight. When speed becomes the priority, accuracy and compliance get left behind.

Automated systems match patterns. They do not apply judgment. A name match is not an identity confirmation. A facial expression score is not a measure of integrity.

The courts have reached a consistent conclusion across these cases. “The AI did it” is not a legal defense. As the employer, you are responsible for every hiring decision — whether a human or an algorithm made the call.

An employer cannot delegate its legal obligations to a software vendor. If the tool discriminates or the background check is wrong, and no human catches the error, the employer is still liable.

How Evolution Consulting Protects Employers

At Evolution Consulting, every criminal background check is performed by a state-licensed human investigator within our licensed investigative corporation. We do not use unvalidated AI systems for criminal record reporting. The reason is simple: a machine cannot replace the judgment a licensed investigator brings to a record match.

When a name appears in a database, an investigator asks the right follow-up questions. Does the Social Security number match? Does the age match? Does the address history align? These are the checks that prevent an innocent person from being labeled a murderer on a background report.
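
Those follow-up questions amount to a simple rule that name-only matching skips: a name hit counts only when the other identifiers agree. The sketch below illustrates that rule; the field names, sample identities, and thresholds are hypothetical, not a description of any vendor’s or our actual system.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Identity:
    name: str
    ssn_last4: str        # last four digits of SSN
    birth_year: int
    zip_codes: frozenset  # known address history

def is_verified_match(applicant: Identity, record: Identity) -> bool:
    """A name hit alone is never a match: require the other
    identifiers to corroborate it before a record is reported."""
    if applicant.name.lower() != record.name.lower():
        return False
    return (
        applicant.ssn_last4 == record.ssn_last4
        and applicant.birth_year == record.birth_year
        and bool(applicant.zip_codes & record.zip_codes)  # any overlap
    )

# Same name, but SSN, age, and address history all differ,
# so the record must not be attributed to the applicant.
applicant = Identity("John Q. Smith", "1234", 1985, frozenset({"13901"}))
record = Identity("John Q. Smith", "9876", 1962, frozenset({"77002"}))
print(is_verified_match(applicant, record))  # prints False
```

The point is not the code itself but the discipline it encodes: every corroborating field must agree, and a human investigator applies that discipline to ambiguous cases a rule can’t resolve.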

Our process is designed for full background check accuracy and FCRA compliance at every step. Errors are caught before reports reach employers — not after the job offer has already been rescinded.

For employers in healthcare, transportation, and other regulated fields, our specialized screening services are built to the same human-first standard. Accurate information, verified by a professional, is what protects your organization.

What Employers Should Do Right Now

If your organization uses any AI or automated tools in hiring, the time to review your process is now — before a lawsuit makes that decision for you.

  • Audit every automated tool in your hiring process, including resume screeners, video interview platforms, and background check vendors.
  • Ask your vendors for independent bias audit results. If they cannot provide them, treat that as a serious red flag.
  • Confirm that a qualified human reviews every decision that eliminates a candidate from consideration.
  • Review your background check procedures for FCRA compliance, including accuracy standards and adverse action protocols.
  • Consult legal counsel before deploying new AI tools, especially if you operate in New York City, California, Illinois, or Colorado.
  • Partner with a screening provider that uses licensed investigators — not automated database queries alone — for criminal record checks. Contact Evolution Consulting to learn how our process works.

The legal landscape around AI hiring tools is changing quickly. The cases above are not cautionary tales from years past. They are active litigation shaping employer liability right now.

Employers who understand what the courts are saying — and build compliant processes before a lawsuit forces their hand — are the ones best positioned to protect their organizations and their candidates. If you have questions about your current background screening process, our team is ready to help. Reach out to Evolution Consulting or call us directly at (607) 773-2266.