Last year, a Russian startup announced that it could scan the faces of people passing by Moscow’s thousands of CCTV cameras and pick out wanted criminals or missing persons. Unlike much face recognition technology, which runs stills from videos or photographs after the fact, NTechLab’s FindFace algorithm works in real time, achieving a feat that once seemed possible only in the science-fiction universe of “Minority Report”: It can determine not just who someone is, but where they’ve been, where they’re going, and whether they have an outstanding warrant, immigration detainer, or unpaid traffic ticket.
For years, the development of real-time face recognition has been hampered by poor video resolution, the angles of bodies in motion, and limited computing power. But as systems begin to transcend these technical barriers, they are also outpacing the development of policies to constrain them. Civil liberties advocates fear that the rise of real-time face recognition alongside the growing number of police body cameras creates the conditions for a perfect storm of mass surveillance.
“The main concern is that we’re already pretty far along in terms of having this real-time technology, and we already have the cameras,” said Jake Laperruque, a fellow at the Constitution Project. “These cameras are small, hard to notice, and all over the place. That’s a pretty lethal combination for privacy unless we have reasonable rules on how they can be used together.”
This imminent reality has led several civil liberties groups to call on police departments and legislators to implement clear policies on camera footage retention, biometrics, and privacy. On Wednesday morning, the House Oversight Committee held a hearing on law enforcement’s use of facial recognition technology, where advocates emphasized the dangers of allowing advancements in real-time recognition to broaden surveillance powers. As Alvaro Bedoya, executive director of the Center on Privacy and Technology at Georgetown Law, told Congress, pairing the technology with body cameras, in particular, “will redefine the nature of public spaces.”
The integration of real-time face recognition with body-worn cameras is further along than lawmakers and citizens realize. A recent Justice Department-funded survey conducted by Johns Hopkins University found that at least nine of 38 body camera manufacturers currently offer facial recognition capabilities or have built in the option to add such technology later.
Taser, which leads the market for body cameras, recently acquired two startups that will allow it to run video analytics on the footage the cameras collect, and Taser’s CEO has repeatedly emphasized the development of real-time applications, such as scanning videos for faces, objects, and suspicious activity. A spokesperson for NTechLab, which has pilot projects in 20 countries including the United States, China, the United Kingdom, and Turkey, told The Intercept that its high-performing algorithm is already compatible with body cameras.
Police see the appeal. A captain in the Las Vegas Police Department told Bloomberg in July that he envisions his officers someday patrolling the Strip with “real-time analysis” on their body cameras and an earpiece to tell them, “‘Hey, that guy you just passed 20 feet ago has an outstanding warrant.’”
At least five U.S. police departments, including those in Los Angeles and New York, have already purchased or looked into purchasing real-time face recognition for their CCTV cameras, according to a study of face recognition technology published by Bedoya and other researchers at Georgetown. Bedoya emphasized that it is only a matter of time before the nation’s body-worn cameras are hooked up to real-time systems. With 6,000 of the country’s 18,000 police agencies estimated to be using body cameras, the pairing would translate into hundreds of thousands of new, mobile surveillance cameras.
“For many of these systems, the inclusion of real-time face recognition is just a software update away,” said Harlan Yu, co-author of a report on body camera policies for Upturn, a technology think tank.
Civil liberties experts warn that just walking down the street in a major urban center could turn into an automatic law enforcement interaction. With the ability to glean information at a distance, officers would not need to justify a particular interaction or find probable cause for a search, stop, or frisk. Instead, everybody walking past a given officer on his patrol could be subject to a “perpetual line-up,” as the Georgetown study put it. In Ferguson, Missouri, where a Justice Department investigation showed that more than three-quarters of the population had outstanding warrants, real-time face searches could give police immense power to essentially arrest individuals at will. And in a city like New York, which has over 100 officers per square mile and plans to equip each one of them with body cameras by 2019, the privacy implications of turning every beat cop into a surveillance camera are enormous.
“The inclusion of face recognition really changes the nature and purpose of body cameras, and it changes what communities expect when they call for and pay for cameras with taxpayer dollars,” Yu said. “I think there’s a real fear in communities of color, where officers are already concentrated, that these body-worn cameras will become another tool for surveillance rather than a tool for accountability.”
Civil rights groups concur that tracking individuals caught on body cameras — either live or using archival footage — could put a chill on First Amendment-protected activities.
“Are you going to go to a gun rights rally or a protest against the president, for that matter, if the government can secretly scan your face and identify you?” Bedoya asked the House Committee in his testimony on Wednesday.
These are not far-fetched concerns, given revelations in recent years of the NYPD’s Demographics Unit, tasked with monitoring the activities of Muslim communities, and ongoing surveillance of Black Lives Matter activists in Ferguson, Baltimore, Washington, D.C., and New York. In a 2010 slideshow, the FBI discussed how face recognition could be used to tag individuals at campaign rallies. And law enforcement officials in Memphis revealed last month that they have used surveillance footage of protesters linked to Black Lives Matter to create a “watchlist” that prohibits those individuals from entering the Memphis City Hall without an escort.
“It’s not hard to imagine the worst way this could play out today, with a digital version of a J. Edgar Hoover-style ‘enemies list,’” Laperruque said, of the use of a real-time watchlist. “Even if we don’t have [a list], the mere threat develops a chilling effect.”
The provisions for such a system are already in place. Other types of real-time searches of biometric databases — such as mobile fingerprinting and rapid DNA tests — are now part of law enforcement routines and face few legal challenges. FBI searches of state driver’s license databases using face recognition software are almost six times more common than federal court-ordered wiretaps, according to the Georgetown study.
The databases, too, have already been built. Georgetown researchers estimated that the faces of one in every two American adults, many of whom have never committed a crime, are captured in searchable federal, state, or local databases. The Department of Defense, the Drug Enforcement Administration, and Immigration and Customs Enforcement are just a few of the federal agencies that can gain access to one or more state or local face recognition systems.
Regular interagency data-sharing programs, such as fusion centers, have given officers the ability to track not only people convicted of crimes, but also petty offenders and immigrants. Immigrants entering and exiting the country with visas have already handed over fingerprints and photos of their faces to the Department of Homeland Security. President Trump has demanded the completion of a biometric system for all travelers at the border, and a new bill introduced Tuesday in the House calls for all ICE agents to wear body cameras.
“I think it is absolutely a concern that face recognition would be used to facilitate deportations,” said Rachel Levinson-Waldman, an expert on policing technology at the Brennan Center for Justice at New York University School of Law. “We’re seeing how this administration is ramping up these deportation efforts. They’re looking much more widely.”
But despite these precedents and possibilities, few departments have outlined policies to limit the pairing of facial recognition technology with body camera footage. In August, Yu and colleagues at Upturn surveyed the major city police departments in the country that have equipped — or will soon equip — officers with body cameras. Out of 50 departments, only six had addressed the use of biometrics such as face recognition with their recordings.
Baltimore’s policy appears to be the first to explicitly prohibit using “stored” body-camera video with face recognition, but it still leaves the door open for real-time recognition. Meanwhile, the Boston police department limits “technological enhancements” to the cameras themselves, “including, but not limited to, facial recognition or night-vision capabilities.” This policy has the opposite problem of Baltimore’s, Yu pointed out, as it still could allow for algorithms to analyze the department’s stored footage retrospectively. He said it was essential that police departments limit the amount of time they keep footage that has no obvious evidentiary value.
“When they have this footage around, it will make it possible for departments to identify all the public places where specific individuals have encountered police over the years,” said Yu. “Given that departments are going down the path of better image recognition and better artificial intelligence technologies, they need to make public promises now that this is not the reason why they want to adopt body cameras.”
Even with ideal policies in place, many privacy experts contend that both face recognition and body cameras are ineffective to begin with.
Body cameras have so far failed to deliver the accountability that President Barack Obama promised when his administration provided over $20 million to supply them to law enforcement. Footage from body cams rarely leads to the prosecution of officers who have shot civilians, and their efficacy in reducing police use of force is supported by limited peer-reviewed research. Not all departments have public policies guiding the use of the expensive equipment. Moreover, those that do have policies in place often insufficiently limit the retention of recordings, the ability to view footage prior to writing reports, and whether and when the cameras should be turned on.
Meanwhile, some studies have shown that the accuracy of facial recognition decreases when identifying black faces and children, when matches are evaluated by human examiners, and as datasets expand. A Government Accountability Office report showed that FBI searches of its Next Generation Identification database over four years returned likely matches with faces only 5 percent of the time.
“The FBI doesn’t test for false positives so it doesn’t know how frequently a system misidentifies someone as a suspect,” Diana Maurer, of the Government Accountability Office, told legislators on Wednesday. “Innocent people could bear the burden of being falsely accused, including bearing the burden of investigators showing up to their home and investigating them.”
Rep. Elijah Cummings of Baltimore added at the hearing, “If you’re black, you’re more likely to be affected by this technology, and that technology is more likely to be wrong. That’s a hell of a combination.”
This month, the National Institute of Standards and Technology released its first-ever test of real-time facial recognition algorithms. While the study doesn’t cover body-worn cameras, it evaluates the use of face recognition on video in other surveillance scenarios, such as transportation hubs, asylum claims, immigration exits, and restraining orders. The study found that real-time face recognition algorithms have yet to reach peak functional performance and ultimately depend on the “very difficult goal” of high image quality and resolution.
Concerns about accuracy are compounded by the fact that the companies’ algorithms are in private hands.
“How accurate is the system that puts a person in jail because it says that person has a warrant out for arrest?” Yu asked. “These systems haven’t been interrogated by the public, and when they aren’t interrogated, it heightens the stakes far beyond what Microsoft or Google might be doing with their data.”
Vendors have the ability to run analytics on the footage and data that officers collect through their body cameras — and to own the results. Agencies working with Taser pay monthly subscription fees to the corporation’s information hub, evidence.com, for instance, which stores the footage on private servers. It’s these recurring storage fees, rather than the one-time cost of the cameras themselves, that are making the stock of Taser’s technology subsidiary soar.
Experts fear that the data and analytics harvested from real-time face recognition may be capitalized on for profit and that the systems will be ripe for overuse. The development of Automatic License Plate Readers, or ALPRs, serves as an instructive example.
ALPR systems, which capture and digitize license plates, were originally pitched as a way to reduce car theft. But with auto theft declining, it was hard to justify the technology’s high cost, and so a private company, Vigilant Solutions, cooked up a scheme to offer it to departments for free. But in exchange, municipalities give Vigilant their records of outstanding arrest warrants and overdue court fees, which the company uses to create a database of “flagged” vehicles. When ALPR cameras spot a flagged plate, officers pull the driver over and ask them to pay the fine or face arrest. For every transaction brokered between police and civilians pulled over with flagged plates, Vigilant gets a 25 percent service fee.
One could imagine a similar arrangement for face recognition. Daniel Fisk, vice president of the body-worn camera vendor Black Mamba Protection, thinks that given the cost of a “luxury” tool like real-time face recognition, it’s likely the software will be introduced on dashboard cameras well before it’s linked to body cameras.
“If face recognition becomes a thing, it will be in the cruisers where the ALPRs are,” he said.
The opportunity to collect revenue from data-driven arrests might incentivize municipalities to invest in such technologies regardless of their accuracy, argues Laperruque, of the Constitution Project. As the record with regular body cameras already makes clear, law enforcement agencies are no strangers to purchasing expensive emerging technologies whether or not they actually work.
The Johns Hopkins survey of the new capabilities of body camera vendors concluded with a word of caution: “The technology is only as good as the people who implement it.”