Yesterday a colleague showed me a deep fake clip ripped from Home Alone. Both the face and voice of Macaulay Culkin, who plays the lead character Kevin, were replaced by those of Sylvester Stallone – ‘Home Stallone.’ If you look around for examples of this deep fake technology, you’ll see people like Hillary Clinton and Vladimir Putin featured in videos where it’s really hard to distinguish what’s real from what’s fake. The documentary ‘The Great Hack’ reveals how the targeted feeding of fake news to specific audiences via online channels has influenced election results in several countries. While these are, of course, worrying signals, the bottom line has always been the same for me: people are susceptible to influence. That’s the way it is now, but that’s basically been the case all along. In that respect, it’s nothing new.
Sometimes when I read the warnings and listen to people talk about this kind of technology, I hear two messages: these days we are pretty far ahead technologically speaking, yet it could be dangerous and turn against us. Both are true, obviously. But this is certainly not the first time in history that humankind has thought this way about itself or about particular technologies. It’s called “technophobia.”
With the advent of Google, we feared that children would no longer learn anything. When the gramophone arrived, the fear was that future generations wouldn’t bother learning to read anymore. And once printed books were invented, people feared that the world would be torn apart by all the subsequent lies. After all, stories used to be told in person, so one could immediately quiz the storyteller and gauge whether they were telling the truth. In a written text, anything could be recounted, yet no one could ever really question the author about what was actually true.
Slanderous campaigns have been part of American politics since the dawn of the United States. Thomas Jefferson hired a writer who deliberately spread ‘fake news’ about John Adams in order to sway the public. Nor is the phenomenon unknown in the business world.
Consider the battle between Edison and Tesla. Edison had built an entire empire around the use of direct current and earned a great deal from the royalties being paid for the use of his technology. He had a lot to lose when Tesla and Westinghouse were advancing the use of alternating current. To protect his market, Edison launched smear campaigns, publicly electrocuting animals – dogs, cats and even an elephant – in an attempt to convince the public that alternating current was dangerous. Ultimately it didn’t work; AC power remains dominant throughout the world to this day.
Personally, I do believe everything will turn out alright in the end. New types of professions are likely to emerge. There’s bound to be more regulation. But we as a society will find a way to deal with all of that. People are able to adapt very quickly. The means have changed over the centuries, but the basic principle of how influence is exerted has not changed.
The most effective weapon boils down to increasing awareness along with knowledge. Be mindful of these kinds of technical developments and always be on the lookout for the truth. Try to figure out what the source is and which interests are being served in bringing something to light. The exchange of information may be getting faster and faster, but at the end of the eighteenth century it was incredibly difficult to find out whether or not John Adams really did want to attack France – regardless of whether you heard that via a spoken rumour or caught it in a deep fake video.
About this column:
In a weekly column, written alternately by Tessie Hartjes, Floris Beemster, Bert Overlack, Mary Fiers, Peter de Kock, Eveline van Zeeland, Lucien Engelen, Jan Wouters, Katleen Gabriels and Auke Hoekstra, Innovation Origins tries to figure out what the future will look like. These columnists, occasionally joined by guest bloggers, are all working in their own way on solutions to the problems of our time. So that tomorrow will be good. Here are all the previous articles.