The Staten Island district attorney’s use of the highly controversial Clearview face recognition system included attempts to dig up the social media accounts of homicide victims and was paid for with equally controversial asset forfeiture cash, according to city records provided to The Intercept.
Clearview has garnered international attention and intense criticism for its simple premise: What if you could instantly identify anyone in the world with only their picture? Using billions of images scraped from social media sites, Clearview sells police and other governmental agencies the ability to match a photo to a name using face recognition, no search warrant required — a power civil libertarians and privacy advocates say simply places too much unsupervised power in the hands of police.
The use of Clearview by the Staten Island district attorney’s office was first reported by Gothamist, citing city records obtained by the Legal Aid Society. Subsequent records procured via New York State Freedom of Information Law request and provided to The Intercept now confirm the initial concerns about the tool’s largely unsupervised use by prosecutors. According to spokesperson Ryan Lavis, the DA’s office “completely stopped utilizing Clearview as an investigative tool last year.”
Yet the documents provide new information about how Staten Island prosecutors used the notorious face recognition tool and show that the software was paid for with funds furnished by the Justice Department’s Equitable Sharing Program. The program lets state and local police hand seized cash and property over to a federal law enforcement agency, whereupon up to 80 percent of the proceeds are then sent back to the original state or local department to pocket.
A May 2 letter to Attorney General Merrick Garland by Reps. Jamie Raskin, D-Md., and Nancy Mace, R-S.C., alleged that the federal program is routinely abused by police. “We are concerned that the Equitable Sharing Program creates a loophole allowing state and local law enforcement to seize assets from individuals without bringing criminal charges or a conviction, even in states that prohibit civil asset forfeiture,” reads the letter, first reported by The Hill.
Public records turned over to the Legal Aid Society in response to its request for information about how the Staten Island DA’s office paid for Clearview included a document titled “Guide to Equitable Sharing for State, Local, and Tribal Law Enforcement Agencies,” which outlines the program and how state entities can make use of it. In a letter sent to the Legal Aid Society and shared with The Intercept, the DA’s office confirmed that federal forfeiture proceeds had paid for its Clearview license. Asset forfeiture has become a contentious and frequently abused means of padding department budgets around the country, and critics say the Equitable Sharing Program gives police in states with laws constraining asset seizures a convenient federal workaround. While civil asset forfeiture is permitted in New York, the state places some limits on how and when seizures can be conducted, rules that the federal program could let a local district attorney skirt.
“The revelation that the funds used to access the Clearview AI service was derived from property obtained without due process, from the same individuals who are most at risk to the devastating consequences of its flaws, is nearly dystopian,” said Diane Akerman, an attorney with the Legal Aid Society’s Digital Forensics Unit. “Perversely, the most overpoliced and targeted communities would be footing the bill for such surveillance through police seizures of their assets,” Akerman added.
“These sorts of search tools not only destroy our privacy, but erode the bedrock of democracy.”
Albert Fox Cahn, executive director of the New York-based Surveillance Technology Oversight Project, told The Intercept that there’s a troubling aptness to the funding. “You have New Yorkers whose assets are being stolen by the police to pay for facial recognition software that works by stealing our faces from social media,” Cahn noted in an interview. To face recognition critics like Cahn, Clearview is emblematic of the technology’s ability to simultaneously eradicate privacy expectations and enhance the surveillance powers of the state. “There’s this pattern here of the public’s money and data being taken without consent in these ways that are deemed lawful but seem criminal. … These sorts of search tools not only destroy our privacy, but erode the bedrock of democracy.”
Among the disclosed records is a long list, albeit almost entirely redacted, of Clearview searches conducted by the DA’s office from 2019 to 2021, including the general purpose of the queries and names of the targets, which The Intercept has redacted to protect the privacy of those scrutinized by the DA. These search logs indicate that on many occasions, Clearview was tapped not to identify suspects in criminal investigations but to find and search through the social media histories of people whose identities were already known, including homicide victims and unspecified “personnel.” A handwritten note appended to a search conducted in January 2020 also indicates that the DA’s office used Clearview to assist in a “deportation case” — a law enforcement investigation not typically within the DA’s remit, particularly given New York’s status as a so-called sanctuary city. “Despite what we claim as being a sanctuary city, there’s no law in New York whatsoever that stops a conservative DA’s office like Staten Island from partnering with ICE,” said Cahn, referring to U.S. Immigration and Customs Enforcement.
The search records indicate that face recognition technology isn’t just proliferating among government agencies but is also being put to broader uses than the public may expect. “Typically, the NYPD’s use of facial recognition technology has been to attempt to identify unknown witnesses or suspects,” Akerman explained. “The Richmond County District Attorney’s Office” — Richmond County is coextensive with the Staten Island borough, and the DA operates as a county official — “is engaging in a new use of the technology — as a form of surveillance of a known person’s social media.” Akerman noted that the New York Police Department, the country’s largest police force, already uses face recognition technologies, and questioned why the smallest DA’s office in the city needed such a powerful tool, particularly given that prosecutors already routinely obtain intimately personal data about individuals during criminal investigations. “DA’s offices already obtain warrants, which are largely rubber-stamped, to search individuals’ cellphones, social media, phone location records, etc., regardless of whether there is a connection to the incident.”
Although face recognition is a potentially invasive and dangerous technology no matter how or where it’s deployed, the Peter Thiel-backed Clearview and its right-wing founder have become emblematic of the threat that the powerful and typically unsupervised software poses, particularly given its rapid adoption by police forces across the country. While the company is already eagerly selling its software to surveillance-hungry police departments, its ambitions are far greater. In February, the Washington Post reported that Clearview recently boasted to investors that it was working toward growing its database of faces to 100 billion images by next year, a number it says would mean “almost everyone in the world will be identifiable” with a simple snapshot. In a sign that the company is expanding its clientele in addition to its capabilities, the Ukrainian military has reportedly begun using Clearview to identify Russian corpses.
Critics of Clearview say the technology represents an untenable threat to personal privacy and, because it requires no judicial oversight, an assault on Fourth Amendment protections against undue searches. Clearview’s degree of accuracy is also unclear, a fact that alarms civil liberties advocates either way: If the technology works as advertised, its surveillance powers are an existential threat to privacy rights, but if it’s inaccurate, it risks implicating innocent people — particularly people of color — in crimes.
The Staten Island DA’s office declined to answer questions about the expansive use of Clearview documented in the search logs.
Cahn, of the Surveillance Technology Oversight Project, agreed that the disclosed records are a worrying sign that Clearview is being used far more broadly than initially advertised. “It’s increasingly clear that Clearview is not just a facial recognition tool, it’s a social media monitoring tool,” he said. “When so many people have social media accounts that they try to keep anonymous, where they try to keep their names off of the account, this becomes yet another tool to map out what people say, what they post, when they’re trying to keep their identities secret.”