More and more governments and companies are leaning on artificial intelligence and algorithms to help make decisions. In the process, however, carelessness and invasions of privacy must be avoided. The Dutch Research Council (NWO) recently awarded 21.3 million euros to ‘The Algorithmic Society’, a project involving several Dutch universities. Automated decisions in the judicial system and in hospitals, for example, will be put under the magnifying glass.
Most people have heard about the scandal that hit the Netherlands: the childcare benefits affair, in which the Dutch Tax and Customs Administration infringed on human rights by using algorithms that led to wrongful ethnic profiling in the monitoring of childcare benefits. This example clearly shows that taking ethical considerations into account when applying algorithms and artificial intelligence is indispensable for the healthy functioning of professional organizations. After all, harrowing situations can arise when an organization relies too heavily on raw computer output and bureaucratic procedure.
“People are pretty quick to trust systems to do what they’re supposed to do. But do they also know on what basis that data is made available to them? Often, this is not the case,” says José van Dijck, university professor of media and digital society at Utrecht University (UU). She is involved in the new project on behalf of the UU. “We want to investigate how we can safeguard important values such as privacy, equality and security.”
Pooling resources
The project will run for ten years and is led by University Professors Natali Helberger and Claes de Vreese of the University of Amsterdam (UvA). The other participating institutions, in addition to Utrecht University, are Erasmus University Rotterdam, Tilburg University and Delft University of Technology. Van Dijck hopes that the research will, among other things, help raise the awareness of professionals who deal with automated processes on a daily basis. “How can we make sure that they remain constantly alert when it comes to safeguarding public values? By looking closely at how the rollout of algorithms unfolds in different sectors of society, we can eventually bring this into focus.”
Human rights
Utrecht University is to receive about 3.5 million euros of the overall budget. Part of the money will be spent on the Impact Assessment Human Rights and Algorithms (IAMA), which was developed by Janneke Gerards at the Utrecht Data School. She and her colleagues created an instrument, a kind of manual, that supports organizations in the decision-making process around the development and deployment of algorithms. Step by step, it outlines the discussion points that need to be addressed before an algorithm is implemented.
When using the IAMA, one of the first questions policymakers should ask themselves is what the concrete objective of using an algorithm is. The instrument also specifies that people should always retain the freedom to reject an algorithm’s decisions. By laying out what a careful decision-making process around algorithms looks like, the manual can help prevent problematic situations such as the benefits affair.
Research into the use of algorithms within governing bodies is just one of the focus areas of The Algorithmic Society project. Van Dijck is excited about the different areas of expertise the universities bring together in the project. “In Utrecht we focus on governance issues, in Amsterdam on research into the effects of algorithm applications, and in Rotterdam there is a special focus on applications in healthcare. The research concentrates on three sectors (media, justice and health) but is broad in scope.”
A moving target
One of the challenges the scientists face is the ever-evolving nature of technology in society, which makes it difficult to predict how algorithms will be used in the future. “Take voice assistants, for instance. These are being used more and more in schools and hospitals. Who knows what effect this will have on the way we deal with algorithms in practice.”
“We are studying a moving target. Consequently, we constantly have to keep our eyes open for changes.”