
Ever since the inception of the World Wide Web, people have feared that they would no longer be able to tell the difference between what is ‘real’ and what is ‘virtual.’ For a long time, this fear could be dismissed as a form of ‘moral panic,’ but the emergence of deepfakes has ushered in a new era. Deepfake videos have surfaced online regularly in recent years, such as the Queen Elizabeth Christmas speech that Channel 4 broadcast last December. With the use of lip-sync techniques, it seemed as if the Queen herself was speaking.

The term ‘deep’ refers to deep learning, a branch of machine learning (the collection of techniques used to develop artificial intelligence, or AI). Deep learning relies on artificial neural networks that search for patterns and complex relationships in datasets at deep, hierarchical levels, much like the neural structures in our brains do. In the case of deepfakes, these techniques are used to merge existing image sources or to create new images (bodies, faces) from them. Deepfakes are not necessarily negative: the internet avatar Sweetie, for example, was created to track down online pedophiles.
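To make that idea a little more concrete, the sketch below shows, in heavily simplified form, the shared-encoder, two-decoder autoencoder setup behind classic face-swap deepfakes: one encoder learns identity-independent features such as pose, expression and lighting, while each person gets their own decoder, and the ‘swap’ happens by decoding one person’s encoding with the other person’s decoder. The layer sizes, the 64×64 face crops and the training loop are illustrative assumptions, not the implementation of any particular tool.

```python
# Minimal sketch of the shared-encoder / dual-decoder idea behind classic
# face-swap deepfakes (illustrative assumptions, not a production system).
import torch
import torch.nn as nn

IMG = 64 * 64 * 3  # assumed 64x64 RGB face crops, flattened to one vector


def make_decoder() -> nn.Sequential:
    # Each identity gets its own decoder that reconstructs a face
    # from the shared latent representation.
    return nn.Sequential(nn.Linear(256, 1024), nn.ReLU(), nn.Linear(1024, IMG), nn.Sigmoid())


encoder = nn.Sequential(nn.Linear(IMG, 1024), nn.ReLU(), nn.Linear(1024, 256))
decoder_a = make_decoder()  # trained only on faces of person A
decoder_b = make_decoder()  # trained only on faces of person B

loss_fn = nn.MSELoss()
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=1e-3,
)


def train_step(faces_a: torch.Tensor, faces_b: torch.Tensor) -> float:
    # Reconstruct A with decoder_a and B with decoder_b; because the encoder is
    # shared, it is pushed to capture pose/expression/lighting rather than identity.
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()
    return loss.item()


# The "swap": encode a face of person A, then decode it with B's decoder,
# producing B's face with A's pose and expression. A random tensor stands in
# for a real, preprocessed face crop here.
with torch.no_grad():
    fake_b = decoder_b(encoder(torch.rand(1, IMG)))
```

Real deepfake software adds convolutional networks, face detection and alignment, and blending of the generated face back into the video frame, but the core trick is this shared representation of two faces.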

Cheap fake

Nevertheless, there are often malicious intentions behind them. In the summer of 2019, a fake porn video surfaced of the Dutch newsreader Dionne Stax: her head had been superimposed onto a naked woman’s body, and the video was then distributed via Pornhub. Less and less technical knowledge is needed to make deepfakes yourself; the low-tech, more accessible variant is also known as a ‘cheap fake.’

Precisely because creating deepfakes requires ever less sophisticated AI and no specific training, the problem will not simply disappear anytime soon. AI will continue to be exploited by criminals in ever more ways. Last summer, researchers at University College London published a list of 20 AI applications and technologies used for criminal purposes; audio and visual abuse tops the list as a ‘high concern’ crime.

Philosophers’ message

Manipulation goes beyond images: it can also involve sound (synthetic voices), as in new forms of phishing in which a person receives a phone call from an engineered voice, for instance one that sounds like an acquaintance, asking them to transfer money. Although the philosophers René Descartes and David Hume lived long before the advent of deepfakes, their message has lost none of its relevance: Doubt. Be skeptical. Your senses can deceive you.