Why we write about this topic:

Digitalization is much needed and brings society many benefits. Yet there are also drawbacks, such as the risk of digital discrimination. In his inaugural lecture as a professor at Radboud University, Frederik Zuiderveen Borgesius elaborates on this.

Dutch law still insufficiently protects people against digital discrimination. That is what Frederik Zuiderveen Borgesius, professor of ICT and law at Radboud University, argues today in his inaugural lecture. Whether you are taking out a new car insurance policy, applying for a mortgage, passing through security at Schiphol Airport, or booking a vacation: every day you deal with digital systems that make decisions about you, often without you having any influence over them or even realizing it. Those decisions can have a significant impact on your life: you can be removed from the queue at Schiphol Airport or pay a few dozen euros more per month for your car insurance.

Not ready yet

Numerous parties deploy ‘digital differentiation’, which can put people at a significant disadvantage. “Dutch law is not yet ready for the choices these algorithms sometimes make,” warns Zuiderveen Borgesius. “We have clear legal rules that protect people against discrimination based on skin color, gender, or similar characteristics. Those standards also apply to discrimination by computers. But digital discrimination is often difficult to detect.”

Distinction by zip code

Zuiderveen Borgesius: “The second category of problems is even trickier. Digital differentiation can also be unfair when it does not specifically target people with a certain skin color or similar characteristics.” For example, some insurers charge higher premiums to people whose house number includes a letter, such as 1A or 90L. That can be unfair. Digital differentiation can also hit poorer people. “Think of an energy company that charges higher advance payments to residents of a zip code where payment arrears are more common. With such an approach, you hit people in poor neighborhoods especially hard. And if people always pay their bills neatly, it is galling for them to be treated as defaulters just because of their zip code.”
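
To make that mechanism concrete, here is a minimal, hypothetical sketch in Python. The zip codes, amounts, and threshold are invented for illustration and do not come from the lecture or from any real energy company; the point is only to show how a rule that looks at neighborhood-level arrears statistics charges a punctual payer extra purely because of where they live.

```python
# Hypothetical illustration (not from the lecture): a zip-code-based pricing
# rule of the kind described above. All zip codes, amounts, and thresholds
# are invented for this sketch.

ARREARS_RATE_BY_ZIP = {"1012": 0.04, "2525": 0.18}  # assumed share of households in arrears

BASE_ADVANCE = 120.0   # assumed monthly advance payment in euros
SURCHARGE = 25.0       # assumed extra charge for "high-risk" zip codes
RISK_THRESHOLD = 0.10  # assumed arrears rate above which a zip code is flagged


def monthly_advance(zip_code: str, always_pays_on_time: bool) -> float:
    """Advance payment under the zip-code rule.

    The customer's own payment history (always_pays_on_time) is deliberately
    ignored here, mirroring how such rules look only at neighborhood statistics.
    """
    if ARREARS_RATE_BY_ZIP.get(zip_code, 0.0) > RISK_THRESHOLD:
        return BASE_ADVANCE + SURCHARGE
    return BASE_ADVANCE


# Two customers with identical, spotless payment records:
print(monthly_advance("2525", always_pays_on_time=True))  # 145.0 -- flagged neighborhood
print(monthly_advance("1012", always_pays_on_time=True))  # 120.0 -- elsewhere
```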

Additional rules

In his speech, Zuiderveen Borgesius elaborates on the legal snags. And there are quite a few: discrimination against poor people or against people with a particular house number, for example, is not prohibited. It is also sometimes difficult to determine which data a computer uses to make choices, and how we can make that transparent. “Current law can only partially protect people from unfair digital differentiation. There are useful rules, for example, in general non-discrimination law and in the General Data Protection Regulation (GDPR). We need to start enforcing those rules better. But the current rules also leave gaps. Additional rules are needed.”

New standards

And that’s tricky. For some forms of digital differentiation, we don’t yet know what the standards should be. Should discrimination against poor people be prohibited? And in which cases? To answer such questions, more interdisciplinary collaboration is needed among computer scientists, lawyers, philosophers, and researchers from other disciplines, says Zuiderveen Borgesius.

Collaboration

This story is the result of a collaboration between Radboud University and our editorial team. Innovation Origins is an independent journalism platform that carefully chooses its partners and only cooperates with companies and institutions that share our mission: spreading the story of innovation. This way we can offer our readers valuable stories that are created according to journalistic guidelines.