The European Union is desperately looking for 20 IT experts. Knowledge of programming languages, artificial intelligence, Internet security, and the use of online algorithms is required, as is genuine enthusiasm for the European Union’s vision of an open and fair Internet as a basic need for European citizens. The team will be based not in dreary Brussels but in Seville, in southern Spain.
There, the European Commission is setting up a brand-new knowledge center: the European Centre for Algorithmic Transparency (ECAT). A total of 30 digital experts will work there, 10 of whom have already been hired. Together with external scientists, they will provide the knowledge that supervisory officials in Brussels need to rap the knuckles of Internet companies that cross the transparency line.
The creation of the center of expertise follows two new laws that took effect this week: the Digital Services Act and the Digital Markets Act. Together, these are intended, in the words of the European Commission, to “ensure that Europeans can more safely navigate the digital world”.
The Markets Act is intended to ensure that large Internet companies (which the Commission calls “gatekeepers”) cannot abuse their power to manipulate the online market to their advantage. In this way, European politicians hope to ensure that companies offering online services have equal access to the dominant platforms.
Oversight needed on the use of algorithms
The Services Act does something similar and aims to regulate the supply of online content. The law establishes as a core value that what is prohibited in the offline world is also illegal online. That means, for example, that market manipulation is not allowed.
The European Union is concerned that, at the heart of just about every online platform and search engine, automated algorithms make the decisions. It is no longer we ourselves but computer programs that increasingly determine which movies we get to see, which music we listen to, and which news stories we read.
Algorithms thus have a great influence on our daily lives. According to European politicians, this is by no means always a problem, but it often is. The new legislation therefore obliges large online platforms to test their algorithms for possible “systemic risks”.
If a heartbroken teenager, for example, watches a few sad videos and is then fed only more misery by the algorithm, heartbreak can turn into depression. In Brussels jargon, preventing this kind of situation is called ensuring that vulnerable users do not come into contact with harmful content.
The fact that people in political discussions seem to understand each other less and less is also blamed on algorithms. Each side of a conflict, whether over the coronavirus, the war in Ukraine, or the usefulness of another Donald Trump presidency, consequently lives in its own world (or Internet bubble). Almost unnoticed, the computer programs of commercial companies thus manipulate people’s political consciousness.
With the Digital Services Act in hand, the European Commission wants to combat malicious algorithms. If officials suspect that a platform has crossed the line, they have the right to request the software in use and examine it.
High social relevance
That examination will be the work of the 30 IT experts in government service. They must provide the evidence that enforcers need to hand out fines.
The European Commission does not foresee any difficulty in finding 20 or so IT experts in this tight labor market. After all, these are jobs with “high social relevance,” says a Commission official.