A woman identified in court documents as Lipps was wrongfully arrested and jailed for more than five months after an AI facial recognition system misidentified her as a criminal suspect in a case that originated 1,200 miles from where she lived. A Fargo, North Dakota, detective used a facial recognition tool to generate a potential match, then manually compared that output against social media profiles and a Tennessee driver's license photo. On the strength of perceived similarities in facial features, body type, and hairstyle, the detective treated the match as confirmed, and charges followed. <a href="/news/2026-03-15-innocent-grandmother-jailed-six-months-after-ai-facial-recognition">Lipps</a> was arrested and held for more than five months before anyone from law enforcement interviewed her. The fallout contributed to the departure of Fargo's police chief, publicly characterized as a retirement.
The primary failure was institutional rather than algorithmic. The AI flagged a possible match; a detective confirmed it; prosecutors pursued it; a court permitted months of pretrial detention without demanding basic verification. The sequence closely mirrors the early history of forensic pattern-matching evidence. Through the 1990s and 2000s, prosecutors and courts treated techniques such as hair comparison and bite-mark analysis as near-infallible long before the forensic science community had established rigorous error-rate standards, a dynamic the Innocence Project has since linked to hundreds of wrongful convictions, many of them later overturned by DNA testing. Facial recognition is now being deployed with even less scientific vetting, and with accuracy gaps that vary sharply by race and gender: a 2019 NIST evaluation of 189 commercial facial recognition algorithms found that many produced false positives for Black women at rates up to 100 times higher than for white men.
Lipps is not the first. In February 2023, Detroit police arrested Porcha Woodruff, then eight months pregnant, on the strength of a facial recognition match in a carjacking case; prosecutors dismissed the charges within a month. In 2020, Robert Williams became the first publicly known American wrongfully arrested on the basis of a facial recognition match, after a search run through Michigan State Police's system misidentified him in a Detroit shoplifting case. Each incident followed the same sequence: an automated system produced a hit, reviewers accepted it with minimal scrutiny, and the person arrested had no fast mechanism to challenge the identification before spending days or months in custody. No federal statute currently restricts how law enforcement agencies deploy facial recognition, and most departments have published neither accuracy thresholds nor mandatory human-review protocols. Michigan State Police have since restricted their facial recognition use; Fargo has not said whether it will do the same.