Without thinking about it, we use our face to unlock our phones. Facial recognition as a way to gain entry to events is also becoming more common. It sounds convenient, until you see what is happening in China, where cameras equipped with facial recognition constantly monitor citizens. This enables China to single out not only shoplifters and traffic violators from a crowd, but also, for instance, individuals who are critical of the regime.
Such a system will probably never materialize in Europe, but where do we draw the line? Theo Breuers thinks we should ban facial recognition in public spaces. Yet together with his company People Flows, he is building a digital data vault that is itself based on facial recognition. You can store airline tickets, admission tickets or medical data in this vault and decide for yourself who you share it with.
How does it work? “It’s really simple,” Breuers laughs. He has been working on facial recognition technology since 2018. “You download the app, enter your details, scan your passport and take a selfie. These are not stored as a single unit, but only as vectors to confirm that it is you.”
Cannot be traced back to an individual
These vectors, or digital facial features, are linked via a random number to an admission ticket (which you can upload into the app) for a festival, for example. At the festival, visitors look into the camera and, if their facial characteristics match the ticket, they are allowed to go in. Breuers: “This is impossible to trace back to an individual. Moreover, the user is always in control. They decide for themselves who can view their data.”
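The enrollment-and-matching flow Breuers describes can be sketched roughly as follows. This is a minimal illustration: the vector size, the cosine-similarity measure, the threshold and the token scheme are assumptions made for the example, not details of the People Flows system.

```python
import math
import secrets


def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def enroll(face_vector):
    """Link a face vector to a ticket via a random number.

    Only the vector and the random token are stored, never the photo itself.
    """
    token = secrets.token_hex(16)  # random number tying the vector to a ticket
    return {"token": token, "vector": face_vector}


def admit(stored, live_vector, threshold=0.9):
    """At the gate: compare the live scan against the enrolled vector."""
    return cosine_similarity(stored["vector"], live_vector) >= threshold
```

A near-identical live scan clears the threshold and opens the gate; a different face does not, and neither side of the check ever handles a stored photograph.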
Similarly, you can use the digital vault from People Flows to store other data such as diplomas, medical records or documents. Inside the vault, you decide which parts of these documents you want to share and with whom. “This way, you are in charge of your own data. Why should we have all kinds of things scattered all over the place? With biometric data, such as a face, you can store all those sensitive documents in a secure way.”
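The idea of deciding, per document, which parts you share and with whom can be pictured with a toy model. The document names, fields and parties below are made up for illustration; this is not how People Flows implements its vault.

```python
from dataclasses import dataclass, field


@dataclass
class Vault:
    """Toy model of selective sharing: the owner grants, per document,
    which fields a named party may read."""
    documents: dict = field(default_factory=dict)  # name -> {field: value}
    grants: dict = field(default_factory=dict)     # (doc, party) -> allowed fields

    def grant(self, doc, party, fields):
        """Owner decides which fields of a document a party may see."""
        self.grants[(doc, party)] = set(fields)

    def read(self, doc, party):
        """A party only ever receives the fields it was granted."""
        allowed = self.grants.get((doc, party), set())
        return {k: v for k, v in self.documents[doc].items() if k in allowed}


vault = Vault()
vault.documents["diploma"] = {"degree": "MSc", "grade": "8.5", "id": "private"}
vault.grant("diploma", "employer", ["degree"])
print(vault.read("diploma", "employer"))  # {'degree': 'MSc'}
```

An employer granted access to the degree never sees the grade or the identifier, and a party with no grant at all receives nothing.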
Currently, People Flows is testing its system at the Heracles Almelo soccer club and will soon be launching a pilot at two airports; Breuers is not at liberty to say which ones. So far, according to Breuers, soccer supporters have been responding enthusiastically to the pilot. “But you have to explain carefully what you are doing and make it clear that there is practically no risk of losing your data. They find the speed and ease of being able to gain admission using their face a major advantage. In previous tests at events, about 85 percent of the festival-goers who were approached took part. We expect to reach this figure at the Heracles Almelo stadium as well.”
Should it be banned or not?
In his estimation, the system also works extremely well in keeping people with stadium bans away from the entrance gate. “People with stadium bans cannot buy tickets and only people who’ve got tickets can enter the stadium.”
However, he does not support the use of such systems in the public domain, for example, to track down (serious) criminals. “The technology is already advanced enough for this and we have already shown that we can remove a person from a stadium holding 25,000 people. But we are absolutely against this. We will never go in that direction either. We want users to have the choice to determine what happens with their face. We shouldn’t really want facial recognition in public spaces. It should be forbidden. There are other ways to organize society,” Breuers points out.
Jeroen van Rest has been working at the Netherlands Organisation for Applied Scientific Research (TNO) for twenty years now, where he does research on facial recognition. According to him, a total ban would not make much sense. “The technology presumably has some useful specific applications, also in public spaces. It is, of course, already in consumer electronics like phones. This creates familiarity and is a way to introduce the technology. Moreover, there are countries that are already using it. Then we usually refer to countries that are suspected of using it unethically, such as China or Russia. But for us in Europe, it is educational to look at England, Israel, the US and Australia. These are countries that are perhaps not as far removed from us in terms of culture, and the technology is already being used there. We need to become masters of the technology and figure out why, when and where it can and cannot be used.”
Facial recognition in public spaces
For instance, Van Rest sees several use cases where facial recognition in public spaces could very well be among the options. “A nuclear power plant is a private organization and its security is not a legal task as defined under the terms of the General Data Protection Regulation (GDPR). Yet it goes without saying that this kind of plant must be properly protected. That protection, or an acute threat, can, in GDPR terms, constitute a ‘legitimate interest’ for using facial recognition. By conducting experiments in limited and specific use cases, we develop a form of collective self-confidence. What advantages does that bring? What about the disadvantages? This gives us a more realistic picture of just what is and isn’t possible and preferable.”
Van Rest contends that uncontrolled growth can lead to worrying incidents. “And I won’t be surprised if those incidents are substantial. But developments in the field of facial recognition are moving fast, and for several years now they have also aimed at preventing an unfair distribution of errors, for instance so that ethnicity, age or gender does not lead to disproportionately more erroneous recognitions. A ban would be throwing the baby out with the bathwater; any knowledge that’s gained would then be wasted.”
Pressure from civil society
“What if the EU were to say that facial recognition will be banned in public spaces? I can imagine that at some point there will be pressure from civil society to use facial recognition in public spaces after all.”
However, there must always be a good reason to do this. Using facial recognition to round up a group of youths who are causing a nuisance at a supermarket is going too far, he believes. “Also, the minister has already said that we will never switch to a generic form of live facial recognition. As such, camera surveillance on its own cannot be a valid basis for using this technology.”
Security and privacy
According to Van Rest, it is important to take the privacy of filmed passers-by into account as much as possible when developing facial recognition. At the same time, it is also worthwhile exploring how the technology could still be useful to security agencies. In an experiment last year, in collaboration with the Johan Cruyff Arena and the Dutch Police and under the auspices of TNO, Van Rest worked out an idea from the police: multi-party computation (MPC) as a secure shield around facial recognition. “Municipalities and police have a mutual interest in keeping public spaces safe. Yet there is another interest at play: privacy. Why, for example, should all images be shared with the police?”
This is where MPC comes in. Van Rest: “Imagine that you have two data streams: one from the camera images and the other from the police watchlist. With MPC, you can process these two streams together without having to share the actual data. In other words, you do not need a ‘trusted’ third party. Both parties only have a part of the data, but together they have all the information to find the solution they need.”
Calculating salary averages
“Picture it like calculating an average salary between three or more people. Each person splits their own salary into three random numbers that add up to that salary, and then hands two of those three numbers to the others. The total information, which includes the average salary, is still present in the aggregate of that data, but it is no longer traceable to any individual’s salary. This principle can also be applied to the comparison of faces. In that case, you don’t have to share images or watchlists with each other: processing is based on combined numbers, not on the sensitive data itself,” Van Rest explains.
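Van Rest’s salary example can be written out as a short sketch of additive secret sharing. The share ranges and the aggregation step are illustrative assumptions; real MPC protocols work over finite fields and authenticated channels.

```python
import random


def share(salary, n=3):
    """Split a salary into n random shares that sum exactly to the salary."""
    shares = [random.randint(-10**6, 10**6) for _ in range(n - 1)]
    shares.append(salary - sum(shares))
    return shares


def average_salaries(salaries):
    """Each participant keeps one share of every salary and publishes only
    the sum of the shares they hold. The grand total of those published sums
    reveals the average, but no individual salary can be reconstructed."""
    n = len(salaries)
    all_shares = [share(s, n) for s in salaries]
    # participant i collects the i-th share of every participant's salary
    held_sums = [sum(all_shares[j][i] for j in range(n)) for i in range(n)]
    return sum(held_sums) / n


print(average_salaries([30000, 40000, 50000]))  # 40000.0
```

Each published sum is a mix of random numbers from all participants, so it leaks nothing about any single salary; only the combination of all sums carries the answer.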
The experiment in Amsterdam is not intended to be rolled out in practice, Van Rest notes. “It is an exploration of what is possible; we want to show managers and policymakers what is feasible. This will allow us to have a better-informed discussion about security and privacy in public spaces. We want to make sure we keep asking the right questions in case someone decides to use it in public spaces.”