Burglars like to strike during Christmas and around the new year. Mathematical psychologist Dick Willems came up with a solution to make it a lot harder for thieves. He developed a system that can tell in advance where the likelihood of burglary is the greatest. It seems to be working, because recently the police in the Dutch town of Spijkenisse caught two burglars red-handed by using it.

These are the times when burglars like to strike

On Christmas Day, most burglaries take place between two and four in the afternoon, while burglars are active slightly earlier on Boxing Day: around one in the afternoon. On New Year’s Eve, it tends to be between five and eight o’clock in the evening, while on New Year’s Day, the most attempted burglaries take place around one o’clock at night as well as around noon, according to research.

The Crime Anticipation System (CAS) works on the basis of data mining. The system analyzes data such as the most common burglary methods, times and addresses. It also draws on police reports filed by civilians and the number of insurance claims per neighborhood. The aim is to turn this data into actionable insights that show where and when burglars are likely to strike. Each week, the system generates a digital map showing where the probability of burglary is higher than normal.
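CAS itself is proprietary, but the basic idea described above — aggregating historical incidents into location-and-time buckets and flagging the ones that stand out — can be illustrated with a minimal, hypothetical sketch. The grid cells, timestamps and baseline threshold here are invented for illustration and are not CAS's actual model:

```python
from collections import Counter
from datetime import datetime

def risk_map(incidents, baseline=1.0):
    """Toy risk map: count past burglaries per (grid_cell, hour-of-day)
    bucket and flag buckets whose count exceeds a baseline as elevated risk.
    A real system like CAS would weigh many more features (methods,
    addresses, insurance claims) and recompute weekly."""
    counts = Counter()
    for cell, timestamp in incidents:
        hour = datetime.fromisoformat(timestamp).hour
        counts[(cell, hour)] += 1
    # Keep only buckets with more incidents than the baseline expectation.
    return {bucket: n for bucket, n in counts.items() if n > baseline}

# Invented example data: two Christmas-afternoon burglaries in cell "A1".
incidents = [
    ("A1", "2023-12-25T14:10"),
    ("A1", "2023-12-25T14:45"),
    ("B2", "2023-12-31T19:30"),
]
hotspots = risk_map(incidents)
# The ("A1", 14) bucket appears twice and exceeds the baseline of 1
```

In this sketch the output would mark cell "A1" around two in the afternoon as elevated risk, which is where extra surveillance would then be directed.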

Just because the police are stepping up surveillance in high-risk areas does not necessarily mean that we can now all leave our homes without having to lock windows and doors. There are also now scores of innovations that make homes just a little bit smarter. Think of motion sensors, glass break detectors or smart security cameras. Or a magnetic contact, which alerts you via an app if a window or door is still open.

Discrimination against minorities

According to critic Marc Schuilenburg, however, we should not be fixated on CAS. Schuilenburg maintains that there is quite a lot of so-called contaminated data in the system. Schuilenburg is professor of Digital Surveillance at Erasmus University and assistant professor of Criminology at the Vrije Universiteit Amsterdam.

“There is also a downside to predictive policing. Consider the risk that investigative authorities will continue to focus on the same neighborhoods and types of crime. Equally concerning, using data and algorithms to predict crime can lead to discrimination against minorities and the identification of high-risk groups based on certain characteristics. We call this social sorting,” Schuilenburg points out.

The Netherlands Court of Audit has also called the system into question. The use of this algorithm could lead to individuals being suspected without having done anything wrong. The police deny this: “The algorithm is not used on individual members of the public, but exclusively to predict crime in residential areas,” the police state in a response.

Other policing systems

Systems such as these have been the subject of discussion before. Examples include SyRI, the benefit fraud detection system, and ClearView AI, the controversial facial recognition software that police have used at least 50 times, according to American media. The police deny that this software has been used.

The police used ClearView AI to track the movements of civilians even if they were not suspected of anything. The application collected photos from the Internet and then searched for personal data to go with these. As a result, police can identify anyone who has a picture of themselves on LinkedIn or Facebook, for example, whenever they walk past a ClearView AI-equipped surveillance camera on the street.

“If predictive systems like CAS are to develop into fully-fledged and integrity-based detection methods for the government, a basic prerequisite for this is that the data being used is ‘clean’ and also that the objectivity of the data can be verified,” Schuilenburg says.