
Angela Lipps had never been to North Dakota. She was a grandmother of five, living in Tennessee, minding her own business. But an algorithm decided she was a criminal — and police didn't bother checking if the computer was right.
Lipps spent nearly six months in jail, dragged 1,000 miles from her home state on bogus charges of bank fraud. The technology got it wrong, but police refused to apologize, claiming they were just following AI leads. This is what surveillance capitalism looks like in practice: machines deciding who's guilty, humans shrugging, innocent people destroyed.
She's not alone. Across America, facial recognition and AI surveillance tools are jailing the wrong people — grandmothers, pregnant women, grandfathers, students. The companies building these tools — Clearview AI, Flock Safety, Palantir — are building a private police state, feeding data between themselves and federal agencies, while innocent lives are shattered.

Angela Lipps mugshot photo.
The Grandma Who Lost Everything
Angela Lipps, 50, was arrested in Tennessee on July 14, 2025. She was babysitting four children when U.S. Marshals arrived at gunpoint. The warrant: bank fraud in Fargo, North Dakota. A place she had never visited.
Here's how it happened: Fargo police were investigating a series of bank fraud incidents where a woman used a fake U.S. Army ID to withdraw tens of thousands of dollars. Detectives ran surveillance footage through facial recognition software. The AI returned a "match" — Lipps.
Nobody contacted her before the arrest. Nobody checked if she had ever been to North Dakota. Nobody checked if she had an alibi.
The warrant was signed July 1. Lipps sat in a Tennessee jail for 108 days before being extradited to North Dakota. When she arrived in Fargo on her first-ever airplane ride, terrified and humiliated, her lawyer requested bank records. They showed she had been buying cigarettes and depositing Social Security checks in Tennessee during the entire period of the alleged crimes.
The charges were dismissed on Christmas Eve, 2025. But the damage was done. Lipps had lost her house. She lost her car. She lost her dog. She was released from custody on December 24 with no money, no coat, and no way home.
Fargo Police Chief Dave Zibolski acknowledged "a few errors" and pledged changes. But he refused to apologize, saying they still don't know who committed the crimes.
The department had relied on Clearview AI, a startup whose database of billions of photos was scraped from social media and the wider internet without consent. The ACLU sued the company; settlements restricted its sales largely to law enforcement. But across the country, police keep using it, and keep arresting the wrong people.

Sunglass Hut in the shopping center where the robbery pinned on Harvey Eugene Murphy Jr. took place.
The Grandfather Sexually Assaulted in Jail
Harvey Eugene Murphy Jr. was a 61-year-old grandfather who had spent three decades building a new life, leaving behind criminal records from the 1980s and 1990s.
In January 2022, a robbery occurred at a Sunglass Hut in Houston, Texas. Two gun-wielding men stole thousands in cash and merchandise. Houston police used AI software from the store's loss prevention team to identify a suspect. The algorithm flagged Murphy.
The problem: Murphy was living in California at the time.
When he returned to Texas to renew his driver's license, he was arrested at the DMV. He was held in jail, where he says he was sexually assaulted by three men in a bathroom — an assault that left him with lifelong injuries.
The Harris County District Attorney eventually determined Murphy was not involved in the robbery. But the damage was irreversible.
Sunglass Hut's parent company EssilorLuxottica and Macy's had used facial recognition software to positively identify Murphy. A Sunglass Hut employee later picked him out of a photo lineup, but Murphy's lawyers allege the loss prevention team met with her beforehand, tainting the identification.
Murphy is now suing both companies. His lawyer, Daniel Dutko, put it plainly: "Mr. Murphy's story is troubling for every citizen in this country. Any person could be improperly charged with a crime based on error-prone facial recognition software just as he was."
The Pregnant Mother Jailed While Getting Kids Ready for School

Porcha Woodruff photo and mugshot side by side.
Porcha Woodruff was eight months pregnant, getting her children ready for school in February 2023, when Detroit police arrived at her home.
She was being arrested for robbery and carjacking — crimes she didn't commit.
Woodruff was taken to jail, where she spent 11 hours. She started having contractions. She had to be treated for dehydration.
The ordeal began with an automated facial recognition search, according to a Detroit Police Department investigator's report. A month later, the Wayne County prosecutor dismissed the case. Woodruff, who had never been in trouble with the law, had been dragged from her home, separated from her children, and forced to give birth under the shadow of criminal charges.
Detroit Police Chief James White later admitted Woodruff's photo should not have been used in the lineup to begin with. But the system that produced the match continued operating without restriction.

Photograph of Nijeer Parks.
The Fathers Arrested in Front of Their Families
Robert Williams was arrested in January 2020 outside his home in Farmington Hills, Michigan, in front of his wife and two young daughters. Police held him for 30 hours after an algorithm identified him as a suspect in a robbery committed a year and a half earlier.
The match was wrong. Williams sued. In June 2024, he reached a landmark settlement with Detroit requiring independent corroborating evidence before any facial recognition match can be used to seek an arrest warrant.
Nijeer Parks was wrongfully arrested in 2019 in Woodbridge, New Jersey, after police used face recognition technology to incorrectly flag him as a shoplifting suspect. Police were so intent on building a case that they cast aside DNA and fingerprint evidence pointing to another suspect.
Michael Oliver, another Detroit man, was also arrested after being misidentified by facial recognition technology. Jason Vernau spent three days behind bars in Miami after being accused of cashing a fraudulent $36,000 check. Police had relied on an image of the wrong person and failed to check Vernau's bank accounts.
The Pattern: Innocence Doesn't Matter
What connects these cases? The victims share characteristics that make them vulnerable to AI misidentification, and the investigations share the same failures:
- They lack political power. Grandmothers, grandfathers, working-class parents.
- They can't afford the lawyers needed to fight back quickly.
- Police treated AI output as gospel, skipping basic investigative steps that would have cleared them.
A January 2025 Washington Post investigation documented at least eight instances of Americans wrongfully arrested after police found a possible facial recognition match. In every case, investigators skipped fundamental steps: checking alibis, comparing physical descriptions, verifying locations.
The facial recognition vendors themselves attach explicit caveats to their systems. Clearview requires agencies to acknowledge that results "are indicative and not definitive" and that officers must conduct further research before acting on them.
According to an April 2024 ACLU submission to the U.S. Commission on Civil Rights, in at least five of seven wrongful arrest cases, police had received explicit warnings that facial recognition results don't constitute probable cause — but made arrests anyway.

Flock camera with a solar panel.
The Companies Building the Surveillance State
Clearview AI
Clearview AI built its database by scraping billions of photos from Facebook, Twitter, Instagram, and other social media platforms without consent. The company now sells access to law enforcement across the country.
After lawsuits from the ACLU and others, Clearview settled a landmark case in Illinois, agreeing to stop selling access to its database to any entity in the state — including state and local police — for five years. But the company continues operating elsewhere, and police departments in most states face no restrictions.
Clearview's technology is error-prone, particularly for people of color. The federal government's own 2019 tests of more than 100 facial recognition algorithms found that they performed worse on Black and Asian faces, with higher error rates for women, people of color, and the elderly.
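Scale makes this worse. A back-of-envelope calculation, a sketch only, using an assumed database size and an assumed false-match rate rather than any vendor's measured figures, shows why a single "match" against billions of scraped photos is weak evidence on its own:

```python
# Illustrative base-rate arithmetic: why one facial recognition "match"
# against a huge gallery is weak evidence. Every number below is an
# assumption for illustration, not a measured figure for any vendor.

database_size = 3_000_000_000  # assumed size of a scraped photo gallery
false_match_rate = 0.0001      # assume 1 in 10,000 non-matches is wrongly flagged

# Even if the true culprit is in the gallery, a search at this scale
# also sweeps up look-alikes from among everyone else:
expected_false_hits = (database_size - 1) * false_match_rate
print(f"Expected innocent look-alike candidates per search: {expected_false_hits:,.0f}")
# -> about 300,000 per query under these assumptions, which is why a
#    top-ranked hit needs independent corroboration before any arrest.
```

Under these assumed numbers, a detective who treats the top hit as the culprit is picking one person out of hundreds of thousands of statistically plausible candidates.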
Flock Safety
Flock Safety operates a different kind of surveillance: automated license plate readers (ALPRs) mounted on poles across America. The company claims to operate in over 5,000 communities across 49 states, performing over 20 billion scans of vehicles every month.
The system doesn't just capture license plates. It logs make, model, color, bumper stickers, dents, and distinguishing features — creating searchable databases of virtually every car in America. Flock claims this "vehicle fingerprint" technology is unique, but it's a dragnet.
Here's the problem: Flock shares its data.
The company's network of cameras can be integrated into predictive policing platforms like Palantir, and Flock has a contract to hand over its data to Palantir directly.
In October 2025, Flock announced a partnership with Amazon's Ring security products, under which residents with Ring cameras could share footage with public safety agencies. The partnership came under fire after Amazon ran a Super Bowl ad depicting AI surveillance finding missing pets. The backlash was so intense that Amazon and Flock canceled the integration.
Flock faces lawsuits and community pushback across the country. In June 2024, a judge in Norfolk, Virginia ruled that collecting location data from the city's 172 Flock ALPRs constitutes a search under the Fourth Amendment, and cannot be used as evidence in criminal cases when collected without a warrant.
In 2025, it was reported that Flock data had been queried for use in immigration enforcement. A pilot program with Customs and Border Protection and Homeland Security Investigations was initiated to help combat human trafficking and fentanyl distribution. Flock halted the program in August because of "confusion and concerns" about the purpose of the investigations — but the company's CEO had previously falsely stated Flock did not have federal contracts.
In December 2025, security researchers found that at least 60 of Flock's "Condor" cameras were exposed to the open internet, where anyone could watch them, download 30 days of video archive, change settings, see log files, and run diagnostics.
A 2021 study of Flock's Falcon camera by surveillance research firm IPVM found a 10% error rate in camera output. Flock halted sales to IPVM and disputed the findings — but the company admits inaccuracies in its cameras have resulted in wrongful arrests in several cities.
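For a sense of scale, here is a rough back-of-envelope check combining the two figures above, Flock's claimed scan volume and IPVM's measured error rate. Extrapolating a 2021 single-camera finding across today's entire network is an assumption made purely for illustration:

```python
# Rough scale check: what a 10% per-read error rate implies at Flock's
# claimed volume. Applying IPVM's 2021 Falcon-camera finding to the
# whole current network is an assumption, not a measured result.

scans_per_month = 20_000_000_000  # Flock's claimed monthly vehicle scans
error_rate = 0.10                 # IPVM's 2021 measured error rate

erroneous_reads = scans_per_month * error_rate
print(f"Potentially erroneous reads per month: {erroneous_reads:,.0f}")
# -> 2,000,000,000 under these assumptions; even if only a tiny fraction
#    ever triggers a stop, the absolute numbers remain enormous.
```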
Palantir
Palantir Technologies operates at a different scale: data fusion. The company's platforms — Foundry and Gotham — help law enforcement and military agencies unify massive datasets from tax records to biometrics, identifying connections between people, places, and events.
Palantir has received more than $900 million in federal contracts since Trump took office. Its most controversial work is with U.S. Immigration and Customs Enforcement (ICE).
In April 2025, Palantir was awarded a $30 million contract to build "ImmigrationOS," a surveillance platform for ICE. This was an addition to an existing contract worth about $17 million.
Until 2022, ICE used Falcon, an app Palantir custom-built for the agency, to track agents' and targets' locations during enforcement operations, record and share information from in-person encounters in real time, and search federal and privately owned databases for names, locations, vehicles, and passport information.
The Guardian described Palantir as the "corporate backbone of ICE that the agency is relying on for surveillance and deportations."
In January 2026, the Electronic Frontier Foundation reported that ICE is using a Palantir tool that feeds on Medicaid and other government data to track people for arrest. The EFF called it "exactly the kind of data privacy abuse that EFF has been warning about."
The Data Pipeline
Here's how the surveillance pipeline works:
- Flock cameras scan license plates and vehicles on public roads, building a database of who drives where and when.
- Police query Clearview AI's database of scraped social media photos to match faces from surveillance footage.
- Both data streams feed into Palantir's platforms, where they're fused with government records — tax data, Medicaid data, biometrics.
- Palantir provides AI-powered predictions and connections to ICE, local police, and federal agencies.
- Police act on AI leads with minimal human oversight, arresting innocent people.
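To make "fusion" concrete, here is a toy sketch of how separate streams, once joined on shared identifiers, answer questions none of them could answer alone. Every dataset, field name, and record below is hypothetical; this is not Palantir's, Flock's, or Clearview's actual API or schema:

```python
# Toy data-fusion sketch. All data, names, and fields are hypothetical,
# invented to illustrate the joining step; no vendor's real schema is shown.

plate_reads = [  # what an ALPR network collects
    {"plate": "ABC123", "location": "Fargo, ND", "time": "2025-06-01T14:02"},
    {"plate": "ABC123", "location": "Memphis, TN", "time": "2025-06-03T09:15"},
]
face_matches = [  # what a face search against scraped photos might return
    {"name": "J. Doe", "plate": "ABC123", "similarity": 0.91},
]
agency_records = [  # government data a fusion platform can join in
    {"name": "J. Doe", "program": "Medicaid", "address": "Memphis, TN"},
]

def fuse(name: str) -> dict:
    """Join all three streams into one profile for a person of interest."""
    plates = {m["plate"] for m in face_matches if m["name"] == name}
    return {
        "name": name,
        "sightings": [r for r in plate_reads if r["plate"] in plates],
        "records": [r for r in agency_records if r["name"] == name],
    }

print(fuse("J. Doe"))  # one query yields movements plus benefit records
```

The point of the sketch is the join itself: once a face match supplies a name and a plate, every other keyed dataset attaches automatically, with no human checking whether the original match was right.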
The companies claim their technology makes us safer. The reality: they're building a surveillance architecture that mistakes innocents for criminals while making it nearly impossible to drive anywhere without being tracked.
The Regulatory Vacuum
Only 15 states had enacted any facial recognition legislation covering law enforcement at the start of 2025. North Dakota, where the case against Angela Lipps originated, is not among them.
There are no federal restrictions on police use of facial recognition technology. There are no federal restrictions on ALPR data sharing with federal agencies. There are no federal restrictions on police use of predictive policing algorithms.
The Fourth Amendment is supposed to protect against unreasonable searches and seizures. But when surveillance is automated and ubiquitous, constitutional protection evaporates. A Norfolk judge recognized this in 2024, ruling that ALPR location databases constitute a search under the Fourth Amendment.
But that ruling applies to one city. Across America, the technology continues operating.

An immigration agent holds up his phone to a journalist in Minneapolis on Jan. 6, 2026. Credit: Chris Juhn
What Happens When Machines Get It Wrong
When Angela Lipps was released on Christmas Eve, she had nowhere to go. She had lost her house, her car, her dog. She had spent nearly six months in jail for crimes committed in a state she had never visited.
When Harvey Murphy Jr. was released from jail after being proven innocent, he carried lifelong injuries from sexual assault by three men in a bathroom.
When Porcha Woodruff was released, she had to give birth to her child under the shadow of criminal charges that never should have been filed.
When Robert Williams was released, his two young daughters had watched their father being arrested in their driveway by armed police.
These aren't glitches. This is a system working as designed.
The Companies Keep Getting Rich
Clearview AI and Flock Safety are privately held; Palantir trades publicly. All three command massive valuations.
Flock Safety has raised $950 million in venture funding, with a $7.5 billion valuation, and operates in some 6,000 municipalities across the United States.
Palantir trades publicly with a market capitalization exceeding $50 billion. Its federal contracts exceed $900 million under the current administration.
Clearview AI's valuation is not public, but it sells access to billions of scraped faces to thousands of law enforcement agencies nationwide.
The companies profit from selling surveillance. The costs — wrongful arrests, sexual assault, lost homes, traumatized children — are externalized onto the people they watch.
What Needs to Happen
Until there are meaningful restrictions on AI surveillance, these cases will continue. Here's what's needed:
- Federal ban on warrantless facial recognition use by police. AI matches should never be probable cause for arrest. They should be investigatory leads only, requiring independent corroborating evidence before any detention.
- Federal ban on warrantless ALPR location tracking. The Fourth Amendment requires it. The Norfolk judge got it right — that ruling should be national law.
- Criminal liability for companies that enable wrongful arrests. When Clearview, Flock, or Palantir's technology leads to false arrests, the companies should face consequences, not just the victims.
- Ban on data sharing with ICE. Federal surveillance tools should not feed immigration enforcement, which targets immigrant communities for political persecution, not public safety.
- Public transparency of AI algorithms. Taxpayer-funded technology that decides who gets arrested cannot be a black box.
- End private surveillance contracts. Police should not be buying surveillance tools from private companies that operate outside democratic oversight.
The technology exists to surveil everyone all the time. The question is whether we accept a society where algorithms can destroy lives without accountability, or whether we reclaim the right to be innocent until proven guilty — not until an algorithm says so.
The police departments that arrested Angela Lipps, Harvey Murphy, Porcha Woodruff, Robert Williams, and Nijeer Parks all failed at the basic work of policing: checking facts, verifying evidence, protecting the innocent. They substituted AI output for human judgment, and innocent people paid the price.
Until we impose limits on this technology, the next wrongful jailing of an innocent person is just one algorithmic error away.
Sources & Methodology
- WDAY News: reporting on Angela Lipps, the Tennessee grandmother wrongfully arrested after an AI facial recognition misidentification
- Tom's Hardware: investigation of AI-driven misidentification, documenting the Angela Lipps case and at least eight wrongful arrests across America
- CBS News: reporting on Harvey Eugene Murphy Jr., a 61-year-old grandfather wrongfully arrested via facial recognition and sexually assaulted in jail
- ACLU: report on police receiving explicit warnings that facial recognition results don't constitute probable cause, but making arrests anyway
- American Civil Liberties Union: case details on Robert Williams, the first publicly reported facial recognition false-positive case, arrested in front of his wife and daughters
- Wikipedia: article on Flock Safety, covering ALPR cameras, Palantir integration, and privacy controversies
- EFF: report on ICE using a Palantir tool that feeds on Medicaid data to track people for arrest
- StopFlock.com: tracking Palantir's $30 million contract with ICE to develop the ImmigrationOS surveillance platform
- Wired: reporting on Palantir's $30 million award to build the ImmigrationOS surveillance platform for ICE
Frequently Asked Questions
- How many people have been wrongfully arrested by facial recognition technology in the US?
- A January 2025 Washington Post investigation documented at least eight instances of Americans wrongfully arrested after police found a possible facial recognition match. In every case, investigators skipped fundamental steps like checking alibis, comparing physical descriptions, and verifying locations.
- Are facial recognition systems accurate?
- Facial recognition technology is error-prone, particularly for people of color. The federal government's own 2019 tests of more than 100 facial recognition algorithms found that they performed worse on Black and Asian faces, with higher error rates for women, people of color, and the elderly.
- Is using facial recognition evidence enough for an arrest warrant?
- According to the ACLU and the vendors' own caveats, facial recognition results "are indicative and not definitive" and should not constitute probable cause. They are meant to be investigatory leads only, requiring independent corroborating evidence before any detention. In at least five of seven wrongful arrest cases documented by the ACLU, police had received explicit warnings that facial recognition results do not constitute probable cause but made arrests anyway.
- What companies are building AI policing infrastructure?
- Three major companies dominate the AI policing landscape: Clearview AI, Flock Safety, and Palantir. Clearview AI built its database by scraping billions of photos from social media platforms without consent. Flock Safety operates automated license plate readers in over 5,000 communities across 49 states. Palantir Technologies operates data fusion platforms and has received more than $900 million in federal contracts.
- What restrictions exist on AI surveillance technology?
- Only 15 states had enacted any facial recognition legislation covering law enforcement at the start of 2025. North Dakota, where the case against Angela Lipps originated, has none. There are no federal restrictions on police use of facial recognition, ALPR data sharing with federal agencies, or predictive policing algorithms.
- What happened to the victims of these wrongful arrests?
- Angela Lipps lost her house, car, and dog after six months in jail. Harvey Eugene Murphy Jr. was sexually assaulted in jail and suffers lifelong injuries. Porcha Woodruff spent 11 hours in jail while eight months pregnant and had to give birth under the shadow of criminal charges. Robert Williams' young daughters watched their father be arrested. These aren't glitches — this is a system working as designed.
