
The professor would never have imagined that his column would have such an impact. Over the past few days, he had been approached by people from the security sector and the science world, by app builders, conspiracy theorists, two surgeons and one chaplain.

But it all started with a phone call when ‘anonymous’ (with a small a) appeared on the professor’s screen. The professor instantly knew what that meant.

“Hacker!” the professor exclaimed, “How are you doing in these bleak times?”

Without any form of greeting, the hacker cried out: “Have you seen what’s happening right now?! Work is being done on an app that makes it possible to do contact tracing. But one that respects privacy! Precisely what you wrote about in your last column.” This all came out as one sentence, without him stopping or pausing to take a breath. It took a moment before the professor understood exactly what the hacker meant.

In his earlier article ‘The professor doesn’t think ahead‘, he had written about the dangers that arise when commercial parties like Google have far more data at their disposal than our national government. He had written that, in theory, Google could comprehend, combat and counter COVID-19 more effectively than the RIVM (the Dutch National Institute for Public Health and the Environment) could.

“I’ll send you a link and phone you later. Cheerio!”

The line went dead. The professor saw that the phone call had lasted exactly 11 seconds. In that time the hacker had also sent him a link to a GitHub document which had been uploaded a few hours earlier.

Decentralized Privacy-Preserving Proximity Tracing

The document was prepared by an international group of scientists working on a technique they refer to as Decentralized Privacy-Preserving Proximity Tracing. This technique offers a way to help the government combat the corona pandemic while keeping the impact on users’ privacy to a minimum, based on the principle of privacy-by-design. In short, the app works as follows:

Once someone installs the app, a unique number is assigned to the app on their smartphone. This number is not tied to an IP address or WiFi-MAC address, IMEI number, phone number or any other data of their device. The app enables this person’s phone to be identified by other app users via Bluetooth. (It should be noted that a smartphone regularly changes its Bluetooth Device Identification Profile, i.e. Bluetooth ID). As such, the app sends out Bluetooth IDs, while at the same time receiving Bluetooth IDs from other phones nearby. Any Bluetooth IDs that are received are stored on the app user’s phone for a limited period.
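The rotating Bluetooth IDs described above can be sketched in a few lines of Python. This is a minimal illustration, not the actual protocol: the key sizes, the 15-minute rotation interval and the derivation scheme are assumptions for the sake of the example.

```python
import os
import hashlib
from collections import deque

ROTATION_SLOTS_PER_DAY = 96  # assumption: a fresh Bluetooth ID every 15 minutes


def next_day_key(previous_key: bytes) -> bytes:
    """Derive the next day's secret key by hashing the previous one (a hash chain)."""
    return hashlib.sha256(previous_key).digest()


def ephemeral_ids(day_key: bytes, slots: int = ROTATION_SLOTS_PER_DAY) -> list:
    """Derive the short-lived Bluetooth IDs broadcast during one day."""
    ids = []
    for slot in range(slots):
        digest = hashlib.sha256(day_key + slot.to_bytes(2, "big")).digest()
        ids.append(digest[:16])  # only a 16-byte truncated ID goes over the air
    return ids


# The initial secret never leaves the device; no IP, IMEI or phone number is involved.
secret_key = os.urandom(32)
todays_ids = ephemeral_ids(secret_key)

# IDs received from nearby phones are kept only for a limited period;
# a bounded queue stands in here for a time-limited local store.
received_ids = deque(maxlen=10_000)
```

Because each ID is derived from a secret that stays on the phone, an eavesdropper who picks up a broadcast cannot link it to the device or to earlier broadcasts.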

Whenever an app user becomes infected with the coronavirus, they can notify a server of this via the app. The server then forwards only the Bluetooth IDs that the infected user’s app sent out within a predefined time frame to the other app users. There, these IDs are compared with the Bluetooth IDs that have already been stored locally. If there is a match that exceeds a certain threshold (based, for example, on signal strength or the duration of the contact), the app displays a message that (the phone of) the user has been in contact with (the phone of) someone who has been diagnosed with COVID-19.
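The local matching step can be sketched as follows. This is a simplified illustration under stated assumptions: the observation count stands in for signal strength or contact duration, and the threshold value is hypothetical.

```python
def exposure_detected(published_ids, stored_contacts, threshold=3):
    """Compare IDs published for infected users against IDs this phone has seen.

    stored_contacts: list of (ephemeral_id, observation_count) pairs, where the
    count is a stand-in for signal strength / duration of the contact.
    Returns True when the accumulated score crosses the notification threshold.
    """
    infected = set(published_ids)
    score = sum(count for eph_id, count in stored_contacts if eph_id in infected)
    return score >= threshold


# Example: one brief sighting is below the threshold, a sustained contact is not.
seen_nearby = [(b"id-aaa", 1), (b"id-bbb", 4), (b"id-ccc", 1)]
```

Note that the comparison happens entirely on the user’s own phone; the server only distributes the list of published IDs and never learns who matched.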

Key privacy aspects

The protocol guarantees several key privacy aspects. Firstly, only Bluetooth IDs are stored (and no location data). Moreover, these are stored in a decentralized manner. Consequently, there is no place where all the Bluetooth IDs converge and could potentially be analyzed. There is a single central server, but it stores only the unique numbers of the app users, along with the (temporary) Bluetooth IDs sent out by a user’s smartphone in the event they become infected.
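How little the central server needs to know can be made concrete with a sketch. This is a hypothetical data model for illustration, not the actual server design: the point is what is absent (no locations, no contact graph, no device identifiers).

```python
class CentralServer:
    """Minimal sketch of the server's role: register apps, relay infected IDs."""

    def __init__(self):
        self.registered = set()   # unique app numbers only, nothing else
        self.infected_ids = []    # ephemeral IDs uploaded after a diagnosis

    def register(self, app_number: str):
        # No IP address, IMEI, phone number or location is recorded.
        self.registered.add(app_number)

    def report_infection(self, ephemeral_ids):
        # The server never learns who met whom; it merely relays these IDs.
        self.infected_ids.extend(ephemeral_ids)

    def published_ids(self):
        # Every app downloads this list and does the comparison locally.
        return list(self.infected_ids)
```

Because the matching happens on the phones, compromising this server would reveal at most which ephemeral IDs belonged to infected users, not their contacts or movements.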

The professor was extremely enthusiastic about this initiative. He immediately phoned his acquaintance the politician to ask if he had already heard about it. But the professor couldn’t even ask his question before the politician stated emphatically: “I won’t allow this crisis to be used to launch apps that allow the government to track people!”

The professor tried to point out: “Well, that’s actually -”

“Before you know it, a citizen will turn into a walking antenna for the government,” the politician interjected. “I demand that the government test these apps thoroughly – seeing that it concerns the privacy of our citizens!”

The professor was slightly dismayed that the politician appeared not to be at all interested in the technique behind the app.

And the government says …

The Dutch Minister of Health, Mr. De Jonge, gave a press conference a few days later. He declared that he wanted to use apps to combat the spread of the coronavirus. “However, that’s only possible if we treat privacy with the utmost caution,” he stated. “This can only be done if we have the confidence of the general public. And that calls for working this out carefully in detail. I hope to be able to tell you more about that soon.”

The first questions followed less than two minutes later. What surprised the professor was that they did not focus on the underlying technique. Nor on the question of whether the effectiveness of a proximity app in combating COVID-19 was at all substantiated. They did not ask about privacy-by-design. Nor about the correlation between the proximity of phones and the spread of the virus. Instead, questions with preconceived ideas were asked, interspersed with weighty terms such as “surveillance state”, “privacy watchdog” and (it was the second time that the professor had heard this expression) “walking antennas.”

It troubled him. Navigation apps like Google Maps, Flitsmeister and Waze have real-time access to the location and travel movements of millions of Dutch people. WhatsApp/Facebook has an up-to-date and very accurate picture of our entire social network, including our contacts who don’t even use WhatsApp. We also take it for granted that all our device and connection data (such as phone number, IP address and device IDs) are being registered and stored. When installing the apps from these companies, consumers apparently consider privacy to be secondary to convenience.

Is privacy secondary to convenience?

“We hand over our sensitive private data to companies such as Google, Facebook or WhatsApp, all of which have just one goal in mind: maximizing their profits,” the hacker said.

“But when it comes to our national health, we are extremely cautious and apprehensive where our privacy is concerned,” the professor added.

“In both cases, we react out of ignorance, I’m afraid,” sighed the hacker.

All of a sudden, the professor figured out what he had wanted to say to the politician.
“Of course, when using government-backed apps, we have to look closely at the impact that they have on our privacy…”
And if the politician had not interrupted him, he would have liked to add to that:
“But if we were no more critical about our privacy here than we are about WhatsApp, then with more than 1 billion users we could very quickly gain a much clearer picture of the global spread of COVID-19.”

But he hadn’t said that.

About this column

In a weekly column, alternately written by Bert Overlack, Mary Fiers, Peter de Kock, Eveline van Zeeland, Hans Helsloot, Lucien Engelen, Tessie Hartjes, Jan Wouters, Katleen Gabriels, and Auke Hoekstra, Innovation Origins tries to find out what the future will look like. These columnists, occasionally supplemented with guest bloggers, are all working in their own way on solutions for the problems of our time. So tomorrow will be good. Read all previous articles in the series here.


About the author

Dr. Peter de Kock is adept at combining data science with scenario planning to prevent crime and enhance safety. De Kock graduated as a filmmaker from the Film Academy of the Amsterdam School of the Arts, where he mastered the art of creating scenarios for feature films and documentaries. After receiving a master’s degree in Criminal Investigation at the Police Academy, he was offered a position within the Dutch National Police force, where he served as acting head of several covert departments. Within this domain he was able to introduce (creative) scenarios to anticipate and investigate crime. In 2014 De Kock combined art, criminal investigation, and data science in his dissertation "Anticipating Criminal Behaviour", with which he earned his doctorate at Tilburg University. De Kock is founder and director of Pandora Intelligence, an independent security company specialized in security risks. The company uses a scenario-based approach to discover narratives in unstructured data, which helps (non-)governmental organisations to mitigate risks and enhance opportunities.