AI Facial Recognition Wrongly Imprisons Tennessee Grandmother

A Tennessee grandmother was jailed for more than five months after a facial-recognition match tied her to bank fraud in Fargo, North Dakota, a state she says she had never visited. Bank records later showed she was in Tennessee when the crimes occurred, and the charges were dismissed.

The case centers on Angela Lipps, a 50-year-old grandmother from Tennessee who spent over five months behind bars after law enforcement relied on an AI-driven facial-recognition match. She was arrested in Tennessee based on a warrant issued in Fargo and transported more than 1,000 miles away for prosecution. Her attorneys later produced records showing she could not have been at the scenes the system flagged.

From CNN:

A Tennessee grandmother spent more than five months in jail after police used an AI facial recognition tool to link her to crimes committed in North Dakota – a state she says she’d never been to before.

Police in Fargo, North Dakota, have acknowledged “a few errors” in the case and pledged changes in their operations but stopped short of issuing a direct apology.

Angela Lipps, 50, was first arrested in Tennessee on July 14, according to a statement from the Fargo Police Department and a verified GoFundMe for Lipps.

Unbeknownst to Lipps, a warrant had been issued for her arrest weeks earlier – in Fargo, over 1,000 miles away from her Tennessee home. Months before, several instances of bank fraud had occurred in and around Fargo, according to police.

Fargo Police Chief Dave Zibolski said, “We’re happy to acknowledge when we make errors, and we’ve made a few in this case for sure.” Fargo Mayor Tim Mahoney explained, “When the chief found some errors that were made in this area where we got our facial recognition and how that worked, we immediately addressed it.”

A North Dakota judge signed a warrant for Lipps’ arrest on July 1, 2025, after a string of bank fraud incidents in Fargo. Investigators had used a facial-recognition search from another agency that flagged Lipps as a potential match, and federal authorities executed an arrest in Tennessee weeks later. Once in custody, the case moved forward largely on that digital identification rather than direct corroborating evidence.

The authorities accused Lipps of orchestrating a fake identification and theft scheme even after her legal team pointed out that there was no proof she had ever been to North Dakota. Those accusations kept her detained through months of legal limbo while records were tracked down and motions were filed. The reliance on a single AI match became the focal point of both public concern and internal review.

After Lipps was transferred to Fargo, one of her lawyers uncovered bank records demonstrating she was in Tennessee at the times the crimes occurred. Prosecutors were informed of the exculpatory evidence on December 12, and the charges were dismissed without prejudice on December 23. That timeline left Lipps confined far from home during an extended process whose core evidence had been undermined.

Zibolski also acknowledged procedural gaps, saying, “There were steps that we overlooked” and that “this is a big training issue from that perspective because there could have been other steps maybe that if it was reviewed under this new process, the supervisor or unit commander would have said ‘let’s maybe try a, b and c before we ever take it to the State’s Attorney’s Office.” Those remarks underline how agency workflows and oversight did not catch the flaws before arrest warrants were sought. The department has signaled policy and training adjustments as part of its internal response.

Lipps’ case joins several high-profile examples where facial-recognition matches produced wrongful arrests. In Detroit in 2023, Porcha Woodruff was arrested while eight months pregnant and held for about 10 hours before the case unraveled. Robert Williams in Michigan was misidentified and detained for almost 30 hours in a separate incident. These failures reveal a pattern of technology outpacing the safeguards around its use.

The episode raises tough questions about how law enforcement should handle automated matches, what verification steps are mandatory before arrest, and how victims of misidentification are made whole. Policy changes and careful oversight are being discussed in many jurisdictions, but for people like Lipps the immediate harm is both personal and severe. The practical fallout—lost freedom, legal fees, and trauma—is a reminder that systems using AI need rigorous checks before they become the basis for detaining individuals.

