In today’s hiring landscape, a single mistake can cost a company its reputation, or even its future. Yet behind the scenes, many background check vendors are quietly replacing trained investigators with AI tools that cannot interpret legal boundaries or reporting limits. What looks like innovation is a compliance crisis in the making, and it leaves the hiring company holding the liability.

Someone has to say it: AI has no place making judgment calls on criminal histories. The savings from automation are nothing compared to the millions in settlements when discrimination lawsuits hit. And they will hit.
AI can process data fast, but it cannot interpret law.
It can’t distinguish:
- A violation from a misdemeanor
- A felony from an infraction
- A conviction from an arrest
- Adult court from juvenile court
- Records within a state’s allowable lookback window from those outside it
- Active records from expunged or sealed ones
- A standing conviction from one changed by pardon, appeal, or sentence modification
It also cannot understand context, intent, or the human reality behind criminal proceedings.
Yet some CRAs (consumer reporting agencies) now use AI to auto-populate background check reports that directly determine a person’s ability to work.
Wrong record?
Outdated information?
Reportable vs. non-reportable offense?
AI doesn’t know, but the courts will hold you responsible.
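To make that failure mode concrete, here is a deliberately simplified sketch of the kind of name-and-date-of-birth matching an automated pipeline might rely on. It is purely hypothetical; every name, field, and function in it is invented for illustration. The point is what the logic never checks.

```python
# Hypothetical sketch of naive automated record matching -- NOT any real
# CRA's system. All names and fields are invented for illustration.
from dataclasses import dataclass

@dataclass
class CourtRecord:
    full_name: str
    dob: str          # "YYYY-MM-DD"
    charge: str       # raw docket text, e.g. "ARREST - THEFT 3RD"
    disposition: str  # e.g. "DISMISSED", "GUILTY", or blank

def naive_match(name: str, dob: str,
                records: list[CourtRecord]) -> list[CourtRecord]:
    """Return every record whose name and DOB match the applicant."""
    return [
        r for r in records
        if r.full_name.lower() == name.lower() and r.dob == dob
        # Never checked: disposition -- a dismissed arrest is "reported"
        # exactly like a conviction.
        # Never checked: record age -- a case outside every state's
        # lookback window is still returned.
        # Never checked: sealed or expunged status, juvenile court,
        # pardons, or appeals.
    ]

# Two different people can share a name and a birth date; this logic
# happily attaches the wrong person's record to the applicant.
records = [
    CourtRecord("Jane Q. Doe", "1990-05-14", "ARREST - THEFT 3RD", "DISMISSED"),
]
print(naive_match("jane q. doe", "1990-05-14", records))
```

Even this toy version surfaces a dismissed arrest as if it were a reportable conviction, which is precisely the kind of output a trained investigator exists to catch.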
The Cost of Getting It Wrong
In 2023 alone, CRAs faced major lawsuits, including:
- Multiple class-action lawsuits for failing to ensure the accuracy of criminal records
- Legal action for failing to obtain proper applicant authorization
- Lawsuits for providing outdated or incorrect criminal information
The financial impact is staggering. The average discrimination lawsuit costs about $40,000, settlements can climb into the millions, and some have exceeded $2 million.
When AI makes a decision a human should have made, the legal consequences land on the hiring company, not the algorithm.
Human Intelligence Is Not Replaceable in Criminal Reporting
Every criminal record requires careful, contextual review by a trained human investigator. Not a bot. Not a script. Not a machine-learning engine.
Before reporting any criminal history, a CRA must understand:
- Federal guidelines under the FCRA (Fair Credit Reporting Act)
- State-by-state reporting restrictions
- Civil and criminal distinctions
- Procedural vs. substantive law
- Specialized federal crime categories
- Sentencing, parole, probation nuances
- Multiple levels of courts and appeals
AI cannot comprehend this legal landscape, and has no accountability when it gets it wrong.
But hiring companies do.
What Every Hiring Company Should Ask Their CRA Immediately
1. “Do you use AI or automated reporting in your criminal background checks?”
If the answer is anything but a clear “no,” walk away.
You need to insist on:
- A written certification that all criminal reporting is human-reviewed
- Assurance that no automation is interpreting legal distinctions
- Investigators trained in federal, state, and county law
If a CRA refuses to disclose their use of AI?
That’s your answer.
2. “How do you manage differences in federal, state, and local criminal law?”
Your CRA should demonstrate fluency in:
- Definitions of crimes
- State reporting windows
- Civil rights violations and law enforcement misconduct
- Federal criminal categories (cybercrime, public corruption, terrorism, etc.)
- All levels of courts, from city court to the U.S. Supreme Court
Anything less is unacceptable.
3. “If AI cannot interpret legal distinctions, why do you use it at all?”
Ask the question.
You’ll see very quickly who is protecting your organization and who is protecting their profit margin.
AI Itself Admits It Cannot Interpret These Laws
When asked how it fits into the criminal reporting process, an AI system responded:
“Organizations need a proactive approach to ensure compliance with diverse legal requirements… by doing so, they can harness the benefits of AI while minimizing risks associated with data privacy and legal compliance.”
Notice what AI didn’t say?
That it can understand or interpret the law.
Because it can’t.
Bottom Line: Automated Criminal Reporting Is a Lawsuit Waiting to Happen
When a CRA uses AI to report criminal history:
- Outdated, sealed, or expunged records get reported
- Wrong individuals get matched
- Legal distinctions get ignored
- Hiring decisions become discriminatory
- Companies get sued
Hiring companies must protect themselves.
Ask the right questions. Demand human review. And never sign with a CRA that replaces legal expertise with automation.
Evolution Consulting’s Position: Human Intelligence First. Always.
We are a compliance-first CRA.
We do not use AI for criminal reporting.
We do not automate legal decisions.
We do not cut corners to save a few dollars.
Every record we report is reviewed by a trained investigator who understands federal, state, and county law, because human lives, human jobs, and your organization’s reputation depend on accuracy.
- Human investigators. Real compliance. Zero shortcuts.
- Fast turnaround times without sacrificing accuracy.
- Audit-ready documentation for every report.
Evolution Consulting, LLC — We keep you safe.
See How We Keep You Safe
