Bosch, the German multinational most famous for its toasters, drills, and refrigerators, is also one of the world’s leading developers of surveillance cameras. Over the last three years, the company has poured tens of millions of euros into its own startup, Azena, which has the potential to completely transform the surveillance camera industry.
Via Azena, Bosch has led the development of a line of surveillance cameras that relies on edge computing — where each camera has its own processor, operating system, and internet connection — to provide “smart” surveillance of people, objects, and places. Like smartphones, these cameras connect to an app store, run by Azena, where customers can purchase apps from a selection of cutting-edge video analytics tools. These apps allow camera owners to analyze video feeds for different security and commercial purposes.
Here, the devil is in the details: In its documentation for developers, Azena states that it will only carry out basic auditing related to the security and functionality of the software available in its app store. According to the company, responsibility for the ethics and legality of the apps rests squarely on the shoulders of developers and users.
In the rapidly advancing field of video analytics, there is a growing market for software that can transform a video feed into a set of data points about individuals, objects, and locations. Apps currently available in the Azena store offer ethnicity detection, gender recognition, face recognition, emotion analysis, and suspicious behavior detection, among other things, despite well-documented concerns about the discriminatory and intrusive nature of such technologies.
Privacy and human rights researchers have expressed concern that by decentralizing and accelerating the creation of powerful surveillance software, capable of analyzing people’s traits and activities without their knowledge, Azena has sharply raised the potential for abuse. Should we be worried?
Azena says no.
Developers and users “must be compliant with the law,” said Hartmut Schaper, Azena’s CEO. “If we find out that this is not adhered to, we first of all ask for fixes, and then — depending on how severe the violation of the contract is — we can take apps out of the app store or revoke the user’s license.”
Unlike its parent company, Azena doesn’t produce cameras or develop video analytics tools. Instead, it provides a platform for companies and individual developers to distribute their own applications and takes a cut of the sales — much like the Apple and Google app stores, but for surveillance software. According to Schaper, Google’s app store is the direct inspiration for Azena: Within just a few years of releasing the Android operating system, Schaper noted, Google had revolutionized how smartphones were used and achieved domination over the market. With their new surveillance app store, Azena and Bosch hope to do the same.
And like Google’s integration of Android with other smartphone manufacturers around the world, Bosch and Azena are working with a number of companies that produce surveillance cameras running their operating system. Schaper thinks this will lead to drastic changes in the surveillance economy: “In the end, there will be just two or three operating systems for cameras that dominate the market,” he said, “just as is the case in the smartphone market.”
So far, the strategy has resulted in swift growth: The Azena store currently contains over 100 apps, and Schaper has boasted of how the business model made it possible to provide “the first face mask detection app within two weeks of the COVID-19 pandemic beginning.” Other apps directed at shops and public spaces promise crowd and line counting alongside more intrusive offers of individual identification, face recognition, and biometric detection.
The company has also actively courted new types of software: Azena’s “App Challenge 2021,” which was judged by representatives from a host of major security companies, resulted in apps claiming to detect violence or aggression and offering the ability to track individual movements across multiple cameras.
It’s the second category — applications that allegedly detect emotions, potential aggression, suspicious behavior, or criminality — that can be impossible to build accurately and is often based on junk science, said Gemma Galdon Clavell, founder of the Barcelona-based algorithmic auditing firm Eticas. “Identifying a person in a space where they shouldn’t be — that works. But that’s very low-tech.” With the more advanced applications, she said, developers often promise more than they deliver: “From what I’ve seen, it basically doesn’t work.”
“When you move from protecting closed-off areas to actually doing movement detection and wanting to derive behavior or suspicion from how you move or what you do,” Galdon Clavell said, “then you enter a really problematic area. Because what constitutes normal behavior?”
Behind the Scenes
For Bosch and Azena, however, these are early days. “I think we’re just at the beginning of our development of what we can use video cameras for,” Schaper said. Azena aims to go “way beyond the traditional applications that we have today,” he added, and interconnect cameras with a host of other sensors and devices.
Brent Jacot, a senior business development manager at Azena, gave an example of how this might work during a 2020 webinar. Imagine you have a camera app that is good at measuring demographics such as age or gender, Jacot said, and you connect it to another app that controls a gate. “You want to, say, open a gate only if they’re above the age of 18. Then you can take the data from this one app and feed it into the next and create this logical chain to make a whole new use case.”
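The chaining Jacot describes can be sketched in a few lines. This is a minimal, purely illustrative example — the function names and return values are hypothetical stand-ins, not Azena’s actual APIs — showing how one app’s output becomes another app’s input:

```python
# A minimal sketch of the app-chaining pattern Jacot describes: one
# analytics app emits a per-person attribute, and a second app consumes
# it to drive a physical control. All names and values here are
# illustrative assumptions, not Azena's real interfaces.

def estimate_age(frame: bytes) -> int:
    """Stand-in for a demographics app; a real one would run model
    inference on the camera's own processor."""
    return 25  # dummy value for illustration

def open_gate() -> str:
    """Stand-in for a gate-control app's actuator call."""
    return "gate opened"

def on_person_detected(frame: bytes) -> str:
    # The "logical chain": one app's output feeds the next app's input.
    if estimate_age(frame) >= 18:
        return open_gate()
    return "gate stays closed"

print(on_person_detected(b"\x00"))
```

The point of the pattern is that neither app needs to know about the other; the platform simply pipes structured data from one to the next, which is what makes arbitrary new use cases so easy to assemble.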
In this example, the people involved might at least know what was happening. But often, the people subjected to video analytics don’t know that the cameras they are so accustomed to seeing are connected to sophisticated software systems, said Dave Maass, director of investigations at the Electronic Frontier Foundation.
“People have an antiquated vision of what surveillance cameras do,” Maass said. “They’re used to seeing them everywhere, but they just assume the video footage is going to some hard drive or VHS tape and no one is looking at it unless a crime occurs.”
People “don’t see when AI is monitoring it, documenting it, adding metadata to it, and also being trained on it.”
If people knew that the footage was being parsed for signs of emotion, anger, or more obscure traits like suspicion or criminality, they might feel differently about it. “They don’t see when AI is monitoring it, documenting it, adding metadata to it, and also being trained on it,” Maass said. “There’s a disconnect between what people are seeing in their day-to-day lives and what’s happening behind the scenes.” Azena also foresees using publicly sourced surveillance footage to train future video analytics algorithms: An informational graphic in the company’s online portal for developers states that camera users “may contribute to enhancements via crowd-generated data.”
Cameras that connect to the Azena app store run an operating system that is a modified version of Android. By using Google’s open-source smartphone operating system as the base for their cameras, Azena’s platform is open to some 6 million software developers around the world. While other surveillance cameras are limited by proprietary operating systems that can only be worked on by niche developers, Azena’s approach aims to put innovation “on steroids,” according to Felicitas Geiss, the company’s vice president of strategy and venture architecture.
Azena recognizes that security cameras are a frequent target for hackers and says it has hardened its operating system against intrusion. Security experts say that, done correctly, using Android could mean better security than proprietary software, given the platform’s open code and frequent updates. But for the cameras that connect to Azena, that promise may not hold.
Internet of Things devices often run old software that users don’t think to update, explained Christoph Hebeisen, head of security intelligence research at the mobile security firm Lookout. “That’s why routers get hacked, that’s why security cameras get hacked, and often in very large numbers.”
There are also cases where human error is at fault: Last March, after locating a username and password that were publicly accessible on the internet, a hacking group said it gained access to tens of thousands of cameras produced by the California-based security startup Verkada, some of which were hooked up to video analytics software.
On many platforms, including Android, when developers patch a vulnerability, they publish a notice in the form of a Common Vulnerabilities and Exposures, or CVE, entry. Azena, Hebeisen said, appears to be years behind on patching CVEs: Judging from the webpage where it summarizes system updates, its current operating system addresses Android CVEs only through 2019.
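The kind of lag Hebeisen is pointing to is easy to quantify. As a rough sketch — the sample patch-level value below is hypothetical, though the `YYYY-MM-DD` format matches the string Android devices report in the `ro.build.version.security_patch` property — one can compute how far a device’s last security patch trails the present:

```python
# Rough sketch of the gap Hebeisen describes: how far a device's
# reported Android security patch level trails a given date. The
# patch-level string format (YYYY-MM-DD) follows Android's
# ro.build.version.security_patch property; the sample value is
# hypothetical, not taken from an actual Azena camera.

from datetime import date

def patch_lag_days(security_patch: str, today: date) -> int:
    """Days between the device's last security patch and `today`."""
    year, month, day = map(int, security_patch.split("-"))
    return (today - date(year, month, day)).days

lag = patch_lag_days("2019-11-01", date(2022, 3, 1))
print(f"patch level is {lag} days behind")
```

Every day in that window corresponds to published CVEs an attacker can read about at leisure, which is why a multi-year gap is so much more dangerous than a multi-week one.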
“That is really a problem,” Hebeisen said. A determined hacker, he explained, could look at the years’ worth of vulnerabilities and work their way backward to develop an exploit.
“Now, these vulnerabilities might be accessible to an attacker externally, so they could attack those devices and possibly take them over,” Hebeisen added. “And they have the resources and time to do this.”
Azena’s CEO disputed the suggestion that the company is behind on patching Android CVEs. Schaper stated that because cameras running Azena’s operating system lack some hardware functionality that modern smartphones have, like Bluetooth, many Android CVEs don’t apply. Schaper said Azena’s security team evaluates all security patches from Google for their relevance to the camera operating system.
Hebeisen remains skeptical. The company’s response “is hard to verify independently,” he said, pointing to specific vulnerabilities in Android core components that, based on its own documentation, Azena appears to have left unpatched.
“The security of this app store and those apps stands and falls with how well they are being vetted.”
“This process is not transparent to the public,” Hebeisen said, adding that he’d like to see the company “publish regular security advisories that list the vulnerabilities that affect their OS along with the corresponding patches.”
More important, Hebeisen said, the apps on the Azena store are too high stakes to carry so little auditing. “The security of this app store and those apps stands and falls with how well they are being vetted,” he said. “Even with Google Play, sometimes malicious apps slip through — I don’t think this company is nearly as well resourced or would be nearly as careful.”
According to Azena’s documentation for developers, the company checks potential applications “on data consistency” and performs “a virus check” before publishing to its app store. “However,” reads the documentation, “we do not perform a quality check or benchmark your app.”
In comparison to Azena’s inspiration, Google, this appears to be a light-touch process. While Google Play Store developers are also ultimately responsible for the legality of the apps they upload, they are obliged to comply with a barrage of policies covering everything from gambling and “unapproved substances” to intellectual property and privacy.
Google warns developers that “powerful machine learning” is deployed alongside human review to detect transgressions, although widespread SMS scams and the recurrent appearance of stalkerware in the Play Store suggest that this process is not all it’s cracked up to be.
Bosch and Azena maintain that their auditing procedures are enough to weed out problematic use of their cameras. In response to emailed questions, spokespeople from both companies explained that developers working on their platform commit to abiding by ethical business standards laid out by the United Nations, and that the companies believe this contractual obligation is enough to rein in any malicious use.
At the same time, the Azena spokesperson acknowledged that the company doesn’t have the ability to check how its cameras are used and doesn’t verify whether applications sold in its store are legal or comply with developer and user agreements.
The spokesperson also said that users are able to develop or purchase applications from outside Azena’s store and sideload them onto cameras running their operating system, allowing users to run powerful video analytics software without any auditing or oversight.
“Further review beyond the contractual obligations of platform users is not possible, because the apps are not Azena’s own products,” the Azena spokesperson wrote. “The application rights remain entirely with the respective developer who offers it in their own name on the Azena platform.”
A Chilling Effect
In Europe, legislators have recognized a need to regulate and control new technologies that make use of machine learning and advanced algorithms, such as those offered on Azena’s platform. The European Union’s proposed Artificial Intelligence Act calls for balancing the benefits and risks of AI, underpinned by the aim of stimulating economic growth. Where exactly that balance should lie is currently the subject of political negotiations, and it’s unclear if European regulators will be able to keep up with technological advancements.
As the proposed legislation stands, Azena would likely be classed as a distributor of AI technologies, said Sarah Chander, senior policy adviser at European Digital Rights. In the case of “high-risk” apps, this would mean the company would have to ensure that providers complied with the act’s requirements for transparency, risk management, quality checks, and data accuracy; if Azena suspected noncompliance, it would have to inform the provider or withdraw the app from sale and ensure “corrective actions” were taken. “Low-risk” apps, meanwhile, would be governed by voluntary codes of conduct drawn up by government authorities.
“It’s surveillance capitalism on steroids.”
“I doubt the act will help provide accountability for distributors,” Chander wrote in an email. Even if it did, the proposed rules “don’t capture the root of why this platform is problematic. The reason why we should be concerned with a platform like this is because it is accelerating and promoting the uptake of harmful AI systems, accelerating the sale and use of pseudo-scientific, discriminatory surveillance systems, and finding ways to get these systems to market in more and more efficient ways.”
“It’s surveillance capitalism on steroids,” she added.
Echoing this concern, Jay Stanley, a senior policy analyst at the American Civil Liberties Union, said that the technology is not yet able to live up to its claims: Selling emotion detection, in his view, is like selling “snake oil.” But the implications are still concerning. “Things like emotion detection are an easy sell for many people,” Stanley said. “You have all these cameras around your building and [developers] think, for example, who wouldn’t want to get a notification if there was an extremely angry person in the area?”
But Stanley is just as worried about the rapid expansion of simple applications of video analytics. “There’s a real concern here that even on the most effective end of the spectrum, where a video analytics system is trying to detect just the raw physical motion or attributes or objects,” he said, “every time you hand a backpack to a friend or something like that, an alarm gets set off and you get approached.”
“That’s going to have a real chilling effect. We’re going to come to feel like we’re being watched 24/7, and every time we engage in anything that is at all out of the ordinary, we’re going to wonder whether it’ll trip some alarm,” Stanley said.
“That’s no way to live. And yet, it’s right around the corner.”
This article was reported in partnership with Der Spiegel.