A Tennessee grandmother endured over five months in jail after an artificial intelligence-powered facial recognition tool inaccurately linked her to crimes committed in North Dakota, a state she firmly maintains she has never visited. Angela Lipps, 50, was initially arrested in Tennessee on July 14, following a warrant issued weeks earlier from Fargo, over a thousand miles from her home.
While acknowledging “a few errors” in the investigative process, the Fargo Police Department has stopped short of issuing a direct apology to Lipps. Her ordeal highlights growing concerns over the rapid integration of AI technologies into law enforcement practices and the potential for misidentification.

An Unforeseen Ordeal and Cross-Country Extradition
Angela Lipps’ unexpected nightmare began when a warrant for her arrest was issued in North Dakota, stemming from several instances of bank fraud around Fargo. Investigators utilized a “partner agency’s facial recognition technology” along with other investigative steps to identify a suspect. However, it later emerged that the neighboring West Fargo Police Department had acquired its own AI facial recognition system, Clearview AI, a tool that Fargo Police Department executives were unaware of and that has since been prohibited from use.
Clearview AI, known for its vast database of internet-scraped photos, identified a “potential suspect with similar features” to Lipps. This information was then shared with Fargo police. Lipps was arrested in Tennessee on July 14 and spent over three months in a local jail before being extradited to North Dakota in October. Facing felony charges including theft and unauthorized use of personal identifying information, Lipps, who had never flown before the extradition flight, described feeling “terrified and exhausted and humiliated.”
Upon arriving in Fargo, a public defender uncovered crucial bank records confirming Lipps’ presence in Tennessee at the time the North Dakota crimes occurred. This exculpatory evidence led to the dismissal of all charges on December 23, and Lipps was released on Christmas Eve.
Police Acknowledge Procedural Flaws Amidst Broader AI Concerns
Fargo Police Chief Dave Zibolski addressed the case, admitting to “a couple of errors” in the process that led to Lipps’ misidentification. One significant error involved the reliance on information from West Fargo’s AI system without understanding its operational parameters. Fargo police also failed to submit surveillance photos related to the fraud cases to the North Dakota State and Local Intelligence Center, a certified and trained authority in facial recognition technology.
In response, Fargo police have banned the use of West Fargo’s AI system and committed to working with state and federal authorities, implementing monthly reviews for all facial recognition identifications. Despite these operational changes, Chief Zibolski refrained from issuing an apology, stating that the full network of individuals involved in the fraud cases is still under investigation. Lipps’ attorneys, however, criticized the lack of “basic investigative efforts” prior to her arrest, asserting that readily available bank records could have prevented her lengthy detention.
The incident involving Angela Lipps is not isolated, echoing other instances where AI in policing has come under scrutiny, such as a Baltimore County student being handcuffed after an AI system misidentified a Doritos bag as a gun. Experts like Ian Adams, an assistant professor of criminology and criminal justice, caution that the rapid adoption of AI by law enforcement often outpaces rigorous efficacy testing, with human error frequently compounding technological flaws. As Lipps’ legal team explores civil rights claims, her case serves as a stark reminder of the critical need for robust oversight and careful human verification in the deployment of powerful AI tools within the justice system.
