Quite a few innovative solutions for tackling the coronavirus have implications for our privacy. Technology offers plenty of possibilities, but do we want them at the expense of our privacy? Legal privacy expert Jeroen Terstegge, partner at Privacy Management Partners, responds to five issues that are currently under discussion.

An app that helps trace ‘corona traffic’ is in the news now. Is the Dutch government handling this properly?

“The focus of the discussion is on privacy, especially your location data and security. But privacy is not the biggest problem when it comes to this kind of app. Personal data protection is about so much more than that.

As far as the corona app is concerned, stigmatization and discrimination pose a greater risk. An app of this kind is in danger of undermining our trust in each other. Do we still dare trust being in close proximity to each other? Or do we want to check each other’s app status first? In terms of the one-and-a-half-meter economy, how do we deal with people who, according to their app, are a contact risk and haven’t been self-isolating? Or people who don’t want to show their status? Or who don’t have this app on their phone? And what about the one million-plus Dutch citizens who don’t even have a smartphone?

Aside from being a medical problem, corona also risks becoming a social problem. Stigmatization and discrimination of people with a ‘mark’ could well become the new normal again, as was previously the case with AIDS, the plague and leprosy.

I have mentioned before that our biggest challenge with respect to personal data is how we keep collecting more and more data about ourselves. This app is a good example of that. The GDPR (the European Union’s General Data Protection Regulation, ed.) does not apply to the personal use of personal data, such as the list of private phone numbers on your phone. For that same reason, much of the data this app processes on your phone is unlikely to be covered by the GDPR. This also means that there is no oversight by the Dutch Data Protection Authority.”

Isn’t it a matter of weighing things up, as in, an ounce less of privacy in exchange for an ounce more of health?

“Our fundamental rights always count, especially in times of crisis. As Louis Brandeis, the ‘inventor of privacy law’, said back in 1928: ‘Experience should teach us to be most on our guard to protect liberty when the Government’s purposes are beneficent.’ Indeed, in times of crisis and under strict conditions, the government can impose far-reaching restrictions on our fundamental rights.

The bigger the crisis, the more far-reaching the restrictions that are permitted. This also applies to our privacy. But then the government must comply with the associated strict conditions. For one thing, government action must be legitimate, whether via an emergency regulation or not. The measures must also be effective and proportionate. And any measures and their effects must be set out clearly, so that the general public can anticipate them.

Therefore, it is not a matter of privacy or public health. Rather, it is ‘protection of public health with due regard to the rules on privacy restrictions’. If the government does not pay sufficient attention to these rules, you run the risk that the courts will strike down certain measures. As was recently the case in the Netherlands with SyRI (System Risk Indication), an AI system that was to be used for detecting welfare fraud.”

Speaking of the GDPR, in principle it applies to all sectors, doesn’t it?

“What the restrictive measures may entail varies from sector to sector and from subject to subject. Take, for example, Minister De Jonge’s proposal regarding access to electronic patient files without the patient’s explicit prior consent. I don’t really have a problem with that proposal. In the realm of health regulations, we are already familiar with the notion of ‘presumed consent’ when it comes to sharing medical data with, for example, a doctor’s assistant.

But at present this rule does not apply when it comes to sharing information with other doctors who are not involved with a specific treatment. That’s why you must give your consent whenever a doctor in the hospital wants to look at the medical file that your general practitioner has on you.

In this profession, we often refer to ‘contextual integrity’. This is a very useful gauge for determining whether an infringement is acceptable. If, during a medical crisis, doctors exchange your data with each other without your prior consent in order to provide you with the best medical treatment, contextual integrity is preserved. After all, it is all necessary to safeguard your health.

This invasion of privacy is therefore less grave than if governments were to collect telecom data in order to monitor lockdowns. There, contextual integrity is compromised. In that case, such measures would need to be lifted after the crisis. Or they would have to be enshrined in law as a new exception for future crises, but only under strict conditions.”

Working from home is on the rise. What about the employer who wants to keep a close eye on an employee via a camera link?

“This is yet another example where you often have to deal with other rules besides the GDPR. In this case, domestic law and labor law come into play as well. Notably, the employer is responsible for ensuring decent working conditions, including a proper chair suitable for working from home, and regular breaks if you work in front of a screen all day.

However, under domestic law, an employer is not allowed to visit you at home to check whether you are complying with these rules. And asking an employee who is working from home to keep their webcam on all day is simply out of the question. I also get queries from schools as to whether they can require students to switch on their webcams during exams. I think that’s quite acceptable.

In any case, education regulations seem to provide conditions under which this could be required. In view of the transparency requirement, though, this should be properly set out in the examination regulations.

There are more examples of what is technically possible but might conflict with privacy law. For instance, requests to use thermal cameras to record the temperature of employees or visitors to a building have reportedly risen sharply. My initial reaction then is: ‘Is this camera even technically capable of detecting an increase of a few tenths of a degree in someone’s temperature?’

In terms of privacy, after all, the end goal must justify the means. And can the goal be achieved in a way that reduces the invasion of privacy, such as letting people measure their own temperature? Under the GDPR, these questions need to be answered first, particularly where biometric systems such as thermal cameras are concerned.”

Video chatting and video conferencing via Zoom raise concerns. Is this justified?

“On paper, Zoom seems to have done a good job in terms of the GDPR. It has a standard hosting agreement for its business users, as well as a privacy statement that has undergone substantial changes in the meantime. It is also registered under Privacy Shield for data transfers to the United States.

But even if something has been properly regulated on paper, that does not mean that it is in fact safe. Various privacy and security issues have come to light over the past few weeks. For example, there was a link to Facebook. Managers could use Attention Tracking to see whether the participants in the meeting had another screen open. Users suffered from Zoom-bombing (people breaking into Zoom sessions). Also, the encryption turned out not to be end-to-end after all, and so on.

As such, it’s a good thing they’re solving those problems promptly. But the sheer number of problems Zoom has had to deal with suggests that privacy and security are not in its DNA. For employers, this means that they must carefully check all the apps and equipment their employees work with for privacy and security issues before putting them to use. That is something they usually do under normal circumstances. But in this corona crisis, when everyone had to switch over quickly, those kinds of rules turned out to be the first to be broken. Yet the GDPR remains fully in force, even in this crisis.”