
AI Facial Recognition Wrongful Arrests: Legal Guide

When AI Gets the Wrong Face

Related: Clearview AI Class Action

The March 2025 settlement in the Clearview AI class action, valued at $51.75 million, gives class members (anyone whose photos were scraped from the internet) a collective 23% equity stake in the company. See our guide: Clearview AI Settlement: What You Need to Know.

Facial recognition technology is sending innocent people to jail. At least eight Americans have been wrongfully arrested based on flawed AI face matches—and those are only the cases we know about. In nearly every documented case, the victim was Black, reflecting the well-documented racial bias in facial recognition systems that misidentify people of color at significantly higher rates.

The landmark 2024 settlement in Williams v. City of Detroit—$300,000 plus the nation’s strongest police department restrictions on facial recognition—shows that victims can fight back. But as police departments nationwide continue deploying this technology with minimal oversight, more wrongful arrests are inevitable.

  • 8+ wrongful arrests linked to facial recognition (known cases)
  • $300K Williams settlement (Detroit, 2024)
  • $300K Parks settlement (New Jersey)
  • 15 states with facial recognition legislation

The Wrongful Arrest Epidemic

Documented Cases

As of January 2025, at least eight wrongful arrests linked to facial recognition technology have been publicly documented. In nearly every case, the victim was Black.

Robert Williams (Detroit, 2020): First publicly reported wrongful arrest from facial recognition. Williams was arrested in front of his wife and two young daughters, detained for 30 hours in an overcrowded cell. The software incorrectly matched his driver’s license to a 2018 security video from a Shinola watch store. Settlement: $300,000 plus landmark policy reforms.

Nijeer Parks (New Jersey, 2019): Arrested for shoplifting based solely on a facial recognition “possible hit” from a blurry, shadowed driver’s license image. Spent 10 days in jail and nearly 10 months under prosecution. Police ignored DNA and fingerprint evidence pointing to another suspect. Case dismissed for lack of evidence. Settlement: $300,000.

Porcha Woodruff (Detroit, 2023): Arrested for carjacking while eight months pregnant. Nothing in surveillance images or eyewitness statements indicated a pregnant woman was involved. Detained for 11 hours, suffered contractions from stress. First woman to publicly report wrongful arrest from facial recognition. Litigation ongoing.

Michael Oliver (Detroit): Wrongfully arrested based on facial recognition match plus tainted photo lineup. Police obtained arrest warrant using only AI match and witness identification from a lineup constructed around the facial recognition result. Litigation ongoing.

Randall Reid (Georgia): Arrested based on facial recognition match. Case part of growing pattern of wrongful arrests in the South. Details emerging through litigation.

Alonzo Sawyer: Another wrongful arrest victim identified by civil rights advocates. Part of documented pattern showing facial recognition failures disproportionately affect Black Americans.

Case: Williams v. City of Detroit (Detroit, MI, 2024)
Result: $300,000 settlement

First publicly reported facial recognition wrongful arrest. Robert Williams was arrested in front of his family and detained for 30 hours based on a flawed AI match to Shinola store theft video. The settlement included the $300,000 payment plus the nation's strongest police department restrictions on facial recognition use.

Case: Parks v. Woodbridge (New Jersey, 2023)
Result: $300,000 settlement

Nijeer Parks was arrested for shoplifting based on a facial recognition "possible hit" from a blurry image. He spent 10 days in jail and nearly 10 months under prosecution while police ignored DNA and fingerprint evidence pointing elsewhere. The case was dismissed, and Woodbridge settled for $300,000 without admitting wrongdoing.

The Racial Bias Problem

Research consistently shows facial recognition technology is significantly less reliable for people of color:

NIST Studies:

  • Asian and African American faces have false positive rates 10 to 100 times higher than white faces in some algorithms (see the illustrative calculation after this list)
  • Error rates vary dramatically across demographic groups
  • Algorithms trained primarily on white faces perform worse on darker skin tones
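
To make the scale of that disparity concrete, here is a minimal sketch, using hypothetical counts rather than NIST data, of how a false positive rate is computed for two demographic groups and how a 10x to 100x gap plays out. The group labels, counts, and database size below are assumptions chosen only for illustration.

```python
# Illustrative only: hypothetical counts, not NIST data.
# A "false positive" is when the system matches two photos of different people.

def false_positive_rate(false_matches: int, different_person_comparisons: int) -> float:
    """Fraction of different-person comparisons the system wrongly calls a match."""
    return false_matches / different_person_comparisons

# Hypothetical numbers chosen to mirror the 10x-100x disparity NIST reported.
fpr_group_a = false_positive_rate(false_matches=10, different_person_comparisons=100_000)
fpr_group_b = false_positive_rate(false_matches=350, different_person_comparisons=100_000)

print(f"Group A false positive rate: {fpr_group_a:.4%}")   # 0.0100%
print(f"Group B false positive rate: {fpr_group_b:.4%}")   # 0.3500%
print(f"Disparity: {fpr_group_b / fpr_group_a:.0f}x")      # 35x, within the reported 10x-100x range

# Searching one probe photo against a large database multiplies the risk of an innocent "hit".
database_size = 1_000_000  # e.g., a statewide license-photo database (hypothetical)
print(f"Expected innocent candidate matches for Group B: {fpr_group_b * database_size:,.0f}")  # ~3,500
```

Even a fraction-of-a-percent error rate produces thousands of innocent candidate "matches" when a probe image is run against a database of a million faces, which is why corroboration requirements like those in the Williams settlement matter.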

Real-World Impact:

  • 7 of 8 known wrongful arrest victims are Black
  • Porcha Woodruff is the first woman known to have publicly reported a facial recognition wrongful arrest
  • Pattern suggests many more unreported wrongful arrests exist

Why This Matters Legally:

  • Racial disparities may support equal protection claims
  • Evidence of discriminatory impact strengthens civil rights cases
  • Pattern evidence demonstrates systemic problems

The Tip of the Iceberg

These documented cases are only the ones we know about. Most wrongful arrests likely go unreported because:

  • Defendants accept plea deals without knowing facial recognition was used
  • Police don’t disclose facial recognition involvement
  • Victims lack resources to investigate how they were identified
  • Cases settle confidentially without public disclosure

The Williams Settlement: A Blueprint

What Happened

In January 2020, Detroit police wrongfully arrested Robert Williams outside his home based on a facial recognition match to a 2018 shoplifting video from a Shinola watch store. The AI incorrectly matched Williams’ driver’s license photo to a suspect in the grainy security footage.

The Arrest:

  • Williams arrested in front of his wife and two young daughters
  • Neighbors witnessed the arrest
  • Detained for 30 hours in overcrowded, dirty cell
  • Interrogated about a crime he didn’t commit

The Investigation Failures:

  • Police relied solely on facial recognition match
  • No corroborating investigation
  • Photo lineup constructed around AI result
  • Obvious differences between Williams and actual suspect ignored

Settlement Terms (June 2024)

The Williams settlement achieved the nation’s strongest police department restrictions on facial recognition:

Monetary Relief:

  • $300,000 to Robert Williams

Policy Reforms (Most Significant):

  1. No arrests solely based on facial recognition — AI results can only serve as investigative leads
  2. Photo lineup restrictions — Police cannot proceed directly from facial recognition to witness identification
  3. Arrest warrant limitations — Warrants cannot be based solely on facial recognition results
  4. Required corroboration — Traditional investigative methods must confirm AI leads
  5. Mandatory training — Officers trained on technology’s limitations and racial disparities
  6. Case review requirement — Review of 2017-2023 cases where facial recognition was used
  7. Prosecutor notification — If past arrests lacked independent evidence, prosecutors will be notified
  8. Court oversight — Four years of continued court jurisdiction to enforce agreement

Why This Matters for Victims

The Williams settlement established that:

  • Wrongful arrest victims can recover substantial damages
  • Policy reforms are achievable through litigation
  • Pattern of failures supports systemic claims
  • Courts will impose meaningful restrictions on police use

Legal Claims for Wrongful Arrest Victims

Constitutional Claims (Section 1983)

Fourth Amendment:

  • Arrest without probable cause
  • Unreasonable seizure based on flawed technology
  • Failure to adequately investigate before arrest

Fourteenth Amendment:

  • Equal protection violations from racially biased technology
  • Due process violations from unreliable identification methods
  • Discriminatory enforcement patterns

Key Legal Arguments:

  • Facial recognition alone cannot establish probable cause
  • Police cannot ignore exculpatory evidence (DNA, fingerprints, witness descriptions)
  • Constructing lineups around AI results taints identification process
  • Racial disparities in technology performance create discriminatory impact

State Law Claims

False Arrest/False Imprisonment:

  • Arrest without legal justification
  • Detention without probable cause
  • Available in most states

Malicious Prosecution:

  • Criminal proceedings initiated without probable cause
  • Proceedings terminated in victim’s favor
  • May require showing of malice or improper purpose

Negligence:

  • Failure to properly investigate
  • Reliance on flawed technology without verification
  • Failure to train on technology limitations

Intentional Infliction of Emotional Distress:

  • Extreme and outrageous conduct
  • Particularly strong where arrest occurred publicly (like Williams)
  • Pregnancy complications (like Woodruff) support claims

Who Can Be Sued

Individual Officers:

  • Officers who made arrest
  • Detectives who relied on flawed AI match
  • Supervisors who approved arrest without adequate review

Police Department/Municipality:

  • Policy claims (failure to train, inadequate procedures)
  • Monell liability for constitutional violations
  • Respondeat superior for employee negligence

Facial Recognition Vendors:

  • Product liability for defective technology
  • Failure to warn about error rates
  • Failure to disclose racial disparities
  • May be difficult but worth exploring

Case: Woodruff v. City of Detroit (Detroit, MI, 2023-present)
Result: Pending; litigation ongoing

Porcha Woodruff, eight months pregnant, was arrested for carjacking even though nothing in the surveillance images or witness statements suggested a pregnant woman was involved. She was detained for 11 hours and suffered contractions from the stress. She is the first woman known to have publicly reported a facial recognition wrongful arrest. The ACLU filed an amicus brief in December 2024 arguing that reliance on the AI match failed to establish probable cause.

State Legislation Landscape

States with Facial Recognition Laws (2025)

Fifteen states have enacted legislation regulating facial recognition in policing:

State and key provisions:

  • Washington: Requires accountability reports, bias audits
  • Oregon: Limited police use restrictions
  • Montana: Restrictions on government use
  • Utah: Warrant requirements for certain uses
  • Colorado: Bias testing requirements
  • Minnesota: Police use restrictions
  • Illinois: BIPA covers biometric data generally
  • Alabama: Limited regulations
  • Virginia: Use restrictions
  • Maryland: Government use limitations
  • New Jersey: Disclosure requirements
  • Massachusetts: Moratorium/restrictions
  • New Hampshire: Government use limitations
  • Vermont: Use restrictions
  • Maine: Government use limitations

What This Means for Your Case

In states with restrictions:

  • Violations of state law strengthen claims
  • May create private right of action
  • Evidence of non-compliance supports negligence

In states without restrictions:

  • Federal constitutional claims still available
  • Common law false arrest claims apply
  • Policy arguments based on other states’ reforms

ACLU’s Position

The ACLU has called for a federal moratorium on facial recognition use by law enforcement and immigration agencies, arguing:

  • Technology is fundamentally unreliable
  • Racial bias cannot be adequately corrected
  • Innocent people are harmed
  • No adequate safeguards exist

Evidence Preservation for Victims

If You Suspect Facial Recognition Led to Your Arrest

Immediate Steps:

  1. Request your file — Police records may reveal facial recognition use
  2. Document everything — Arrest circumstances, detention conditions, witnesses
  3. Get medical records — If stress caused physical symptoms
  4. Don’t talk to police — Without an attorney present
  5. Contact an attorney — Who handles civil rights cases

Discovery Targets

Evidence types and why they matter:

  • Facial recognition search records: shows AI was used and what it found
  • Photo lineup documentation: whether it was constructed around the AI result
  • Investigation notes: what corroboration (if any) was done
  • Officer training records: knowledge of the technology's limitations
  • Department policies: whether protocols were followed
  • Vendor contracts: technology specifications and warnings
  • Similar cases: pattern of failures with the same system
  • FBI/NIST studies: error rates for the specific technology

How to Know if Facial Recognition Was Used

Many victims don’t know facial recognition led to their arrest. Red flags include:

  • Police seemed certain you were the suspect despite weak evidence
  • Photo lineup was presented early in investigation
  • No eyewitness identified you before your photo was shown
  • Your photo appeared in a lineup even though you don’t match witness descriptions
  • Charges were dropped quickly once you challenged the identification
  • Police can’t explain how they identified you as a suspect

Attorney Discovery:

  • Subpoena police records and investigation files
  • FOIA requests for facial recognition policies and use logs
  • Deposition of arresting officers about identification process
  • Expert analysis of facial recognition technology used

Damages and Recovery

Potential Damages

Compensatory Damages (a rough illustrative tally follows this list):

  • Lost wages during incarceration and prosecution
  • Legal fees for criminal defense
  • Medical expenses (physical and mental health)
  • Damage to reputation
  • Loss of employment opportunities
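
As a rough illustration of how these components add up, here is a minimal sketch with entirely hypothetical figures; real valuations depend on case-specific proof, and none of the numbers below come from the cases discussed in this guide.

```python
# Hypothetical compensatory (economic) damages tally for a wrongful arrest claim.
# Every figure is invented for illustration only.

economic_damages = {
    "lost wages (10 days detained at ~$200/day)": 10 * 200,
    "criminal defense attorney fees": 7_500,
    "medical and mental health treatment": 3_000,
    "lost employment opportunity": 12_000,
}

for item, amount in economic_damages.items():
    print(f"{item}: ${amount:,}")

print(f"Total economic damages: ${sum(economic_damages.values()):,}")  # $24,500
# Non-economic damages (pain and suffering, humiliation) and any punitive award
# come on top of this and are typically argued, not tallied line by line.
```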

Non-Economic Damages:

  • Pain and suffering
  • Emotional distress
  • Humiliation (especially public arrests)
  • Loss of consortium
  • Interference with family relationships

Punitive Damages:

  • Available where conduct was willful or reckless
  • Stronger where police ignored exculpatory evidence
  • Pattern of similar failures supports punitive claims

Settlement Benchmarks

Case, amount, and key factors:

  • Williams (Detroit): $300,000 (first reported case, major policy reforms)
  • Parks (New Jersey): $300,000 (10 days jail, 10 months prosecution)
  • Woodruff (Detroit): pending (pregnancy, public humiliation)

These settlements suggest that strong facial recognition wrongful arrest claims with documented harm can recover $300,000 or more. Cases involving the following may recover substantially more:

  • Longer detention
  • More severe harm
  • Clear police misconduct
  • Pregnancy or medical complications




Wrongfully Arrested Due to AI?

If you were arrested, detained, or prosecuted based on facial recognition technology—especially if charges were later dropped or you were proven innocent—you may have civil rights claims against police and the municipality. Connect with attorneys experienced in police misconduct and emerging technology civil rights litigation.

Get Free Consultation
