So far, it has not been possible to create a properly functioning algorithm that can flawlessly recognize and discard deepfake videos that appear on the Internet. This is a cause for concern, according to Accenture consultants at the free information day High Tech Next in Eindhoven last month. The problem is that the AI used to make the fake videos is easy to apply, and is therefore also within reach of criminals. All kinds of apps can be downloaded from the Internet that let a layperson make their own deepfakes.
One way to recognize deepfake videos is to look at the emotional expression of the face. “It often doesn’t match the content of the words spoken by the fake characters,” said Sven Koolen, data analyst at Accenture.
Increasing numbers of deepfake videos
For the past two years, the number of deepfake videos has increased enormously. People suffer as a result, said Buket Demirel, the Accenture consultant who supervised the deepfake workshop. She cited the example of the Indian journalist Rana Ayyub, whose face was used in a porn video last year, causing her to lose her job, despite the fact that the video footage was completely fake.
Presentation at Accenture High Tech Next. Photo: Lucette Mascini
Another example was reported in September by the Wall Street Journal: a British director of an unnamed energy company had allowed himself to be fooled by a telephone call made using a deepfake voice that sounded exactly like that of the German director of the parent company.
Criminal tricks employee out of 220,000 euros
The employee then immediately followed the supposed boss’ instructions and initiated a bank transfer of €220,000 to a bank account in Hungary. But since the account actually belonged to a criminal, the money was lost to the company.
Deepfake videos are made with the help of AI that analyzes the structure of images of a person – usually a face – and compresses them into a code. The AI also learns a decoder that reconstructs images from this code, so that the encoded images of one person can be combined with those of another character, who then appears to do or say all kinds of things that the real person would never do.
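The shared-encoder, per-person-decoder principle described above can be sketched in a few lines. This is a deliberately simplified toy, not a real deepfake pipeline: the "faces" are random vectors, the encoder is a fixed linear map, and the decoders are fit by least squares rather than deep learning.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "face images": 100 samples of 64-dimensional vectors per person.
faces_a = rng.normal(size=(100, 64))
faces_b = rng.normal(size=(100, 64))

# One shared encoder compresses any face into a short latent code.
encoder = rng.normal(size=(64, 8))

def encode(faces):
    return faces @ encoder  # (n, 8) latent codes

# Each person gets their own decoder, fit here by least squares,
# trained to reconstruct that person's faces from the shared codes.
def fit_decoder(faces):
    codes = encode(faces)
    decoder, *_ = np.linalg.lstsq(codes, faces, rcond=None)
    return decoder  # (8, 64)

decoder_a = fit_decoder(faces_a)
decoder_b = fit_decoder(faces_b)

# The swap: encode a face of person A, but decode it with B's decoder.
# In a real deepfake this yields B's appearance driven by A's
# expression and pose.
fake = encode(faces_a[:1]) @ decoder_b
print(fake.shape)  # (1, 64)
```

In actual deepfake tools the encoder and decoders are deep neural networks trained jointly on many frames of each person, but the swap step works on the same principle: mix one person's encoded images with the other's decoder.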
Google and Facebook still have no cure
It is sometimes difficult to see whether a video is real or fake because the AI does its job so effectively. A few weeks ago, a video of former American president Richard Nixon appeared on YouTube in which he says that the first moon landing in 1969 failed. People who did not experience this historical event – and there are of course many of them now – and who do not know Nixon might think it is a real video. “This makes it look like you can rewrite history,” says Demirel.
According to Koolen, big tech companies such as Facebook and Google are eagerly trying to develop algorithms capable of recognizing deepfakes and taking them down as soon as they are unmasked. But so far this has not been possible.
AI becoming increasingly advanced
The problem is not just that deepfake technology is becoming increasingly sophisticated. Some deepfakes are intended as parody and are easy for the viewer to recognize as such, so these do not need to be removed from the public sphere. For example, before the British elections taking place on 12 December, a video of Boris Johnson appeared in which he praises his rival Jeremy Corbyn. Every viewer understands that Johnson would never do that in real life. But can you teach artificial intelligence to make that distinction? That is the question.
No control on the internet
Another problem is that you can put deepfake videos on the Internet from so many countries that it is impossible to make a watertight system of rules that can prevent deepfakes or make them punishable, Koolen said. “The Internet is everywhere in the world. It is beyond our control.”
Innovation Origins is an independent news platform that has an unconventional revenue model. We are sponsored by companies that support our mission: to spread the story of innovation.