Ever since the conception of the World Wide Web, people have feared that they would no longer be able to tell the difference between what is ‘real’ and what is ‘virtual.’ For a long time, this fear could be dismissed as a form of ‘moral panic,’ yet the emergence of deepfakes has ushered in a new era. Deepfake videos have regularly surfaced online in recent years, such as the fake version of Queen Elizabeth’s Christmas speech last December. With the use of lip-sync techniques, it seemed as if the Queen herself was speaking.
The term ‘deep’ refers to deep learning, a branch of machine learning (a technique for developing and implementing artificial intelligence (AI)). Deep learning alludes to artificial neural networks that search for patterns and complex relationships within datasets at deep, hierarchical levels, much like the neural structures in our brain do. In the case of deepfakes, these techniques are used to merge different existing image sources or to create new images (bodies, faces) from them. Deepfakes are not necessarily negative. Internet avatar Sweetie, for example, was created for the purpose of tracking down online pedophiles.
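To make the merging of image sources a little more concrete: a classic deepfake setup trains one shared encoder (which learns pose and expression) together with a separate decoder per face (which learns each person's appearance). Swapping then means encoding a frame of person A and decoding it with person B's decoder. The sketch below is purely illustrative, with random linear maps standing in for trained deep networks; all names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

IMG_DIM = 64 * 64   # a flattened 64x64 grayscale "face" image
LATENT_DIM = 32     # compressed representation of pose and expression

# One shared encoder learns features common to both faces.
W_encoder = rng.standard_normal((LATENT_DIM, IMG_DIM)) * 0.01

# One decoder per identity learns to reconstruct that person's appearance.
W_decoder_a = rng.standard_normal((IMG_DIM, LATENT_DIM)) * 0.01
W_decoder_b = rng.standard_normal((IMG_DIM, LATENT_DIM)) * 0.01

def encode(image):
    """Compress an image into the shared latent representation."""
    return W_encoder @ image

def decode(latent, W_decoder):
    """Reconstruct an image from the latent, in one identity's style."""
    return W_decoder @ latent

# The "swap": encode a frame of person A, then decode it with B's
# decoder, yielding B's face with A's pose and expression.
frame_of_a = rng.standard_normal(IMG_DIM)
swapped = decode(encode(frame_of_a), W_decoder_b)
print(swapped.shape)  # (4096,)
```

In a real system the encoder and decoders are deep convolutional networks trained on thousands of frames of each face, which is what makes the result photorealistic rather than noise, as it is here.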
Nevertheless, there are often malicious intentions behind deepfakes. In the summer of 2019, a fake porn video surfaced of the Dutch newsreader Dionne Stax. Ms Stax’s head was superimposed onto a naked woman’s body and the video was then distributed via Pornhub. Less and less technical knowledge is needed to make deepfakes yourself; the low-tech, more accessible variant is also known as the ‘cheap fake.’
It is precisely because increasingly less sophisticated AI and no specific training are needed to create deepfakes that the problem will not simply disappear anytime soon. AI will continue to be exploited by criminals in ever more ways. Last summer, researchers at University College London published a list of 20 AI applications and technologies used for criminal purposes. Audio and visual abuse ranks highest as a ‘high concern’ crime.
Manipulation goes beyond images: it can also involve sound (synthetic voices), such as new forms of phishing in which a person receives a phone call from an engineered voice (e.g., that of an acquaintance) asking them to transfer money. Although the philosophers René Descartes and David Hume lived long before the advent of deepfakes, their message has lost none of its relevance: Doubt. Be skeptical. Your senses can deceive you.
Innovation Origins is an independent news platform that has an unconventional revenue model. We are sponsored by companies that support our mission: to spread the story of innovation.