Angela Lipps was watching four of her grandchildren at her Tennessee home on the morning of July 14, 2025, when Fargo police arrived and arrested her at gunpoint.

She had never been to North Dakota. She had never flown on a plane.

A facial recognition algorithm had flagged her as the woman on bank surveillance footage using a fake U.S. Army military ID to withdraw tens of thousands of dollars from Fargo-area accounts. The investigating detective compared the result against Lipps' driver's license and social media photos, noted similarities in her face, build, and hair, and applied for an arrest warrant — without calling her, checking her whereabouts, or seeking any corroborating evidence.

Lipps, 50, was held in a Tennessee jail for 108 days as a fugitive from justice while extradition proceedings moved forward. She was eventually transported to Fargo to face four counts of unauthorized use of personal identifying information and four counts of theft.

Investigators finally interviewed her on December 19, more than five months after her arrest. Her bank records were unambiguous: she had been buying cigarettes, pizza, and Uber Eats in Tennessee at the exact hours the fraud was being committed 1,200 miles away. Charges were dropped five days later, on Christmas Eve.

By then, she had lost her home, her car, and her dog. Released into a North Dakota winter without a ride or warm clothing, she was stranded in Fargo until local defense attorneys pooled money for a hotel room. The founder of the F5 Project, a Fargo-based reentry organization, drove her as far as Chicago so she could make her way back to Tennessee. Lipps says no one from the Fargo Police Department has apologized. Chief David Zibolski, who declined on-camera interviews for more than a week, addressed the matter only briefly at his own retirement press conference.

The case fits a pattern that researchers and civil liberties lawyers have spent years documenting. Facial recognition systems produce higher error rates for women, older people, and people with darker skin. When detectives anchor their own visual review to an algorithm's output rather than testing it against independent evidence, those errors carry forward into arrests. Detroit, New Orleans, and Atlanta have all produced similar cases in recent years. Legislators in several states are now pushing to require corroborating evidence before facial recognition results can support a warrant.

The person who committed the Fargo bank fraud has not been identified.