Facial recognition is jailing the wrong people, but police keep using it anyway — Tennessee grandmother latest victim of AI-driven misidentification

Angela Lipps speaks with WDAY News during an interview about her wrongful arrest.
(Image credit: Matt Henson / WDAY)

A Tennessee grandmother, Angela Lipps, spent nearly six months in jail after police in Fargo, North Dakota, used facial recognition software to identify her as the primary suspect in a bank fraud case, according to reporting by WDAY News.

Fargo police were investigating a series of bank fraud incidents in April and May last year, in which a woman used a fake U.S. Army ID to withdraw tens of thousands of dollars. Detectives ran surveillance footage through facial recognition software, which returned a match to Lipps. A detective then compared her Tennessee driver's license and social media images to the suspect and concluded that she was the perpetrator based on facial features, body type, and hair. Nobody from the department contacted Lipps before U.S. Marshals arrested her at gunpoint on July 14 while she was babysitting four children.


Lipps sat in a Tennessee county jail for 108 days before North Dakota officers collected her. Her attorney, Jay Greenwood, immediately requested her bank records, but Fargo police did not meet with Greenwood and Lipps until December 19, five months after her arrest. The records showed she had been buying cigarettes and depositing Social Security checks in Tennessee at the very times police placed her in Fargo. The case was dismissed on Christmas Eve, but the damage was done: released with no money, no coat, and no way home, Lipps subsequently lost her house, her car, and her dog.

It's not unusual

Shockingly, this is just the latest in a series of structural failures that have led to innocent people being prosecuted for crimes they didn't commit. A January 2025 Washington Post investigation documented at least eight instances of Americans wrongfully arrested after police found a possible facial recognition technology (FRT) match; in every case, investigators skipped fundamental steps, such as checking alibis and comparing physical descriptions, that would have cleared the suspect before arrest.

The facial recognition vendors themselves, such as Clearview AI, even attach explicit caveats to their systems. Clearview requires agencies to acknowledge that results "are indicative and not definitive" and that officers must conduct further research before acting on them. According to an April 2024 ACLU submission to the U.S. Commission on Civil Rights, in at least five of seven wrongful arrest cases, police had received explicit warnings that FRT results don’t constitute probable cause but made arrests anyway.

Robert Williams, whose 2020 wrongful arrest in Detroit was the first publicly reported FRT false-positive case, reached a landmark settlement with the city in June 2024 that now requires independent corroborating evidence before any FRT match can be used to seek an arrest warrant. However, only 15 states had enacted any FRT legislation covering law enforcement at the start of 2025, and North Dakota is not among them.

As for Lipps, she is now back home in Tennessee, awaiting an apology from the Fargo Police Department that hasn’t yet come.



Luke James
Contributor
  • Ralston18
    AI (all caveats aside) can certainly make a mess of things. Not a fan by any means.

    However, the real problem is the breakdown of the judicial system and the lack of "common sense" overall.

    If the facial recognition system had picked out some celebrity that would have been resolved within hours. Or even a family member of such celebrity.

    But find someone old, poor, defenseless, etc. - you can see what happens.

    AI / FRT only started the problem. The judicial system, the police, and likely bureaucratic others, made the situation worse, and put it on a path to get even worse. As what happened.

    Others need to be held accountable as well.
    Reply
  • SonoraTechnical
    Luke,
    Thank you for posting this story. They literally ruined the woman's life. It's disgusting, isn't it?

    Ralston18 said:
    However, the real problem is the breakdown of the judicial system and the lack of "common sense" overall.

    But find someone old, poor, defenseless, etc. - you can see what happens.

    The judicial system, the police, and likely bureaucratic others, made the situation worse, and put it on a path to get even worse. As what happened.

    Others need to be held accountable as well.
    The level of injustice served this woman is beyond belief. Being poor, she didn't have the resources to fight it.
    Reply
  • ravewulf
    We live in such a dark timeline...
    Reply
  • PEnns
    Unbelievable!!

    This woman needs some bad a$$ lawyer and sue the heck out of everybody involved: The Facial rec. Company, the bank and the police. And the horses they rode on too!!
    Reply
  • abufrejoval
    She should count herself lucky: just standing close to false positives means you lose the ability to complain in other parts of the world.
    Reply
  • DougMcC
    PEnns said:
    Unbelievable!!

    This woman needs some bad a$$ lawyer and sue the heck out of everybody involved: The Facial rec. Company, the bank and the police. And the horses they rode on too!!
    This is how you solve this problem. Get her a multimillion dollar judgement and Fargo will be thinking twice about the value their facial recognition software is bringing them.
    Reply