"THE ALGORITHM MADE IT UP: CFPB Warns CRAs That 'AI Error' is No Defense for False Criminal Hits"

December 29, 2025 posted by Steve Brownstein

"AI is supposed to make background checks faster, but in 2025, 'hallucinations' are making them more dangerous. If your AI is inventing criminal records to fill in the gaps, your 'Reasonable Procedures' defense won't save you in court."

In 2025, AI "hallucinations"—where a model confidently generates fabricated facts—have moved from a tech curiosity to a major liability for Consumer Reporting Agencies (CRAs).

For a background investigator, an AI hallucination isn't just a typo; it’s a fabricated criminal record or a non-existent court case that can lead to immediate FCRA litigation.

1. The "Starbuck v. Google" Effect

A landmark 2025 case (Starbuck v. Google) has set the tone for the industry. Activist Robby Starbuck sued after Google’s AI generated false accusations of sexual assault and invented court documents that didn't exist.

  • The "Worm": Tech companies often argue that hallucinations are "unavoidable system properties."

  • The Reality for CRAs: The FCRA rejects the idea that "complexity makes error unavoidable." If your AI "invents" a record, you remain liable for failing to maintain "reasonable procedures to assure maximum possible accuracy."

2. How Hallucinations Creep Into Reports

AI doesn't just make up names; it "hallucinates" connections:

  • The "Phantom Match": An AI tool might see a gap in a candidate’s history and "fill it" by pulling a record for a person with a similar name in a different city, presenting it as a confirmed hit.

  • Citation Fabrications: AI has been caught citing non-existent case numbers or misinterpreting "Dismissed" as "Convicted" because it "predicted" the next word in a sequence rather than reading the actual legal status.

  • The "Circular Data" Problem: In 2025, hallucinated records are beginning to seep into public datasets. If a lawyer once used AI to draft a brief with fake cases, and your AI now "scrapes" that brief, the AI will report those fake cases as facts.
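The "reasonable procedures" the FCRA demands can be made concrete in code. The sketch below is a minimal, hypothetical guardrail (all class and function names are illustrative assumptions, not any vendor's API): an AI-suggested record is never reported as a confirmed hit unless hard identifiers match exactly, and a same-name record from another jurisdiction, the classic "phantom match", is routed to a human instead of the report.

```python
# Hypothetical matching guardrail -- illustrative only, not a real CRA tool.
from dataclasses import dataclass


@dataclass
class Candidate:
    full_name: str
    dob: str    # ISO date, e.g. "1990-04-01"
    state: str


@dataclass
class CourtRecord:
    full_name: str
    dob: str
    state: str
    disposition: str  # e.g. "Dismissed", "Convicted"


def classify_match(candidate: Candidate, record: CourtRecord) -> str:
    """Return 'confirmed', 'needs_human_review', or 'no_match'."""
    name_match = candidate.full_name.casefold() == record.full_name.casefold()
    dob_match = candidate.dob == record.dob
    state_match = candidate.state == record.state

    if name_match and dob_match and state_match:
        return "confirmed"
    # A same-name record with a different DOB or jurisdiction is exactly
    # the "phantom match" scenario: never auto-report it as a hit.
    if name_match:
        return "needs_human_review"
    return "no_match"
```

The design choice is the point: the AI is allowed to *suggest* a record, but only exact identifier agreement can promote it to "confirmed" — everything weaker falls back to human review.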

3. Regulatory Pressure (CFPB Circulars)

The Consumer Financial Protection Bureau (CFPB) has been clear: Using an algorithm doesn't exempt you from the FCRA.

  • If your AI "evaluates" or "assembles" data to determine someone’s eligibility for a job, you are a CRA.

  • You cannot blame the "black box" of AI for an error. You must be able to explain exactly how the AI reached its conclusion and prove that a human verified the source.
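One way to make "prove that a human verified the source" auditable is to bake it into the data model. The sketch below assumes a hypothetical report-item structure (none of these names come from the CFPB or any real product): an AI-generated finding simply cannot reach a releasable state until it carries both a primary-source document and a named human verifier.

```python
# Illustrative audit-trail structure -- assumed design, not a CFPB mandate.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ReportItem:
    ai_summary: str                        # what the model produced
    source_document: Optional[str] = None  # e.g. a court docket file or URL
    verified_by: Optional[str] = None      # ID of the human reviewer

    def releasable(self) -> bool:
        # "The AI said so" is never a release state: both the primary
        # source and a human sign-off must be present.
        return self.source_document is not None and self.verified_by is not None
```

Usage is straightforward: `ReportItem(ai_summary="Case 24-CR-0001, Dismissed")` starts out non-releasable, and only becomes releasable after `source_document` and `verified_by` are filled in by a reviewer.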
