The European General Data Protection Regulation (GDPR) will be evaluated this year. Yet it was originally drafted at a time when IoT, AI, smart sensors and cloud services were virtually non-existent. As in the past, the GDPR will likely not take the latest technological developments sufficiently into account. That is the view of legal expert Jeroen Terstegge, who is calling for a different approach.
Terstegge is considered a leading expert in the field of privacy legislation. The partner at Privacy Management Partners was consulted when the General Data Protection Regulation (GDPR) was being drawn up. “The core issue is that the GDPR states that exceptions must be made for private individuals, even though they themselves are sending ever more data to online services.”
That’s why he doesn’t think the law makes much sense. “The history of privacy legislation began with the digitalization of organizations. Data plays a role in business processes such as HR and client administration. But we, as consumers, are also digitalizing our own lives, for example with cloud services for photos or music. We ourselves determine which data we process, not the companies we send the data to. And that’s where the problem lies. If I do something with your data on Facebook, then the GDPR applies to Facebook – but not to me.”
Data minimization
Terstegge points out that the GDPR requires, among other things, that companies practise data minimization: companies, governments and other organizations are not allowed to collect more data about us than is deemed necessary. This at a time when the current fourth industrial revolution is actually generating ever more data.
Terstegge: “The fact that the GDPR is not applicable to private activities prevents companies from realizing data minimization. Many of those online services are free of charge, so many companies look for ways to monetize this mountain of data. This leads to profiling and trading in your data – sometimes referred to as surveillance capitalism. It also blurs the boundary between the private and the public domain. If you search the internet for a disease, you share your secret with that search engine and it is then no longer private. Your secret is stored, analyzed, packaged into a profile and sold.”
The legal expert isn’t satisfied with the current consent solution – that familiar ‘I agree’ tick box. After all, how many people actually read all the terms and conditions that accompany that question? And do they really understand the risks involved?
Innovative companies want clarity
Innovative companies cannot cope with the vague provisions of the GDPR. Terstegge: “Artificial intelligence, AI, is playing an ever larger role in our lives – whether it’s a self-driving car, hiring or appraising personnel, or smart facial recognition of criminals. The GDPR prohibits computers from making any decisions that adversely affect us without direct human involvement.”
“The legislator may make exceptions to this rule, provided they contain appropriate safeguards for our privacy. But the problem is that this legislation doesn’t exist (yet). Companies that invest in new digital solutions, such as artificial intelligence, need clarity about the GDPR’s scope.”
“Now it’s a bit like playing soccer in the fog: you have no idea where the lines are. As an ‘omnibus bill’, the GDPR will never really clarify where those lines are. That uncertainty about what is and isn’t allowed by the GDPR has a paralysing effect on some companies. In other cases, it leads to indifference. At best, it tends to cause delays.”
Rules per sector
To address the pervasiveness and outdated nature of the GDPR, Terstegge believes the law should play a less prominent role. He thinks it is wiser and more in line with reality to draw up rules per sector or theme governing what is and is not allowed where personal data is concerned.
He cites a recent alcohol test in the workplace that was prohibited by the Dutch Data Protection Authority (Dutch DPA, aka Autoriteit Persoonsgegevens, AP). The Dutch DPA came to this conclusion on the grounds that health-related data must not be processed under the General Data Protection Regulation. However, the question of whether an employer can legally oblige an employee to undergo such a test is not a matter for the GDPR, but rather for labour law. If the alcohol test is permitted under labour law, then the associated processing of personal data must also be permitted under the GDPR. The GDPR may, however, impose additional conditions, such as security measures or the right to view relevant dossiers. (Incidentally, the Dutch cabinet recently announced that it is working on legislation that would make such a test possible, ed.)
GDPR: Law of Everything
Terstegge: “The GDPR is gradually becoming the Law of Everything, as everything we do in daily life leads to the logging of personal data. That’s a worrying trend, because the GDPR was never meant to function that way. What’s more, you’re asking too much of the regulators. How we want to structure our digital society, and what is and isn’t allowed in it, is the task of legislators. Are ‘smart cities’ allowed to track our phones on the street? Can we hang surveillance cameras that use facial recognition to track down criminals? Is it legal to enter our medical data into an algorithm for medical research without our consent?”
“These are all questions that legislators need to address and, where necessary, translate into concrete rules. The Dutch DPA only plays its part once those rules are implemented in practice. All too often I see politicians hiding behind the GDPR and its regulators when it comes to digital issues.”