“Apple listens to sex via Siri” has dominated the headlines over the past week. Apple thus joins Google and Amazon on the illustrious list of companies that listen in on our lives and learn from us. Just as an undercover agent passes information on to their handler, the undercover assistant is infiltrating our lives.
For years the tech giants have been engaged in an epic battle to offer a digital assistant. The ostensible goal is to develop a personal assistant that makes the user’s life easier by carrying out certain tasks. But it should no longer be a secret that the real goal of these companies is to understand their users better, so as to gain a commercial advantage.
Breaking the rules
The problem that surfaced last week concerns voice activation, or voice control. In order to perform a task, the voice-activated assistant has to listen to the user (after all, how does Apple know that you said “Hey Siri” if it doesn’t listen in?). As soon as the assistant recognizes the wake phrase, the system is activated. This can lead to situations in which the system ‘thinks’ it hears the wake phrase even though it was never spoken. The system then starts recording, even though it finds no question or command in the audio. To improve speech recognition, some of these recordings are analyzed by human reviewers. Privacy watchdogs are now examining whether the rules on the use of personal data have been broken in the process.
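To make the mechanism concrete, here is a minimal, purely illustrative sketch of how a wake-phrase detector could trigger a recording. The detector_score function, the 0.8 threshold and the recording window are my own assumptions, not Apple’s, Google’s or Amazon’s actual implementation; the point is only to show how false triggers arise.

```python
import random

WAKE_THRESHOLD = 0.80          # assumed confidence cut-off; real values are not public
RECORDING_WINDOW_CHUNKS = 10   # assumed length of the clip kept after a trigger

def detector_score(audio_chunk):
    """Stand-in for an on-device wake-phrase model.

    Returns a confidence between 0 and 1 that the chunk contains the
    wake phrase. Here it is just random noise, purely to show how
    false triggers can arise."""
    return random.random()

def listen(audio_stream):
    """Score incoming audio continuously; once the score crosses the
    threshold, keep a short recording, even when no real request follows
    (a false trigger). Some of those clips may later be sampled for
    human review to improve the model."""
    for chunk in audio_stream:
        if detector_score(chunk) >= WAKE_THRESHOLD:
            clip = [chunk] + [next(audio_stream, None) for _ in range(RECORDING_WINDOW_CHUNKS)]
            yield clip

if __name__ == "__main__":
    fake_microphone = iter(range(1_000))   # placeholder for real microphone chunks
    activations = sum(1 for _ in listen(fake_microphone))
    print(f"Activations in this run (all of them false triggers): {activations}")
```

Every activation in this toy run is a false trigger. In practice the false-trigger rate is far lower, but spread across hundreds of millions of devices it still produces an enormous stream of unintended recordings.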
But the underlying question is far more pertinent: are we sufficiently aware of what we are granting the digital assistant access to?
The phone manufacturer has our biometric data at its disposal for unlocking our device. The messaging service has a very accurate and up-to-date picture of our contacts. The navigation app knows who lives where, and who has a relationship with whom. The social media company can predict our behaviour better than our partners can, and the smart wearable collects health data and workout results on a server. Lastly, our search engine knows our deepest fears and desires.
In return for a little convenience, we hand over sensitive or highly specific personal data to commercial companies en masse. We do this more or less consciously, and by accepting the terms and conditions we take the privacy risks into the bargain. But the digital assistant is designed to connect data from various applications. This creates a risk that is not only greater than the sum of its parts: it transcends the individual user and has far-reaching consequences for the privacy of others.
Calendar access
Take the following hypothetical example. When I ask the digital assistant “What’s my next appointment?”, that question can only be answered if the assistant already has access to my calendar, which includes appointments with non-users, such as “Sanne’s Birthday Party”. In itself, this is not sensitive information. But when I then give the command “Navigate to Sanne de Vries”, the assistant retrieves her address and links it to the navigation app. When I instruct the assistant “Pay 50 euros to Sanne for the trip to Sudan”, payment details are linked. And when, later in the evening, I ask the assistant to text Sanne “I still can’t believe you’re going to be a mother”, her mobile number is pulled from my contact list.
Interaction with the digital assistant thus produces a profile of Sanne in which her name, date of birth, address, mobile number, financial data, travel plans, and even information about her health are all brought together. And of course the digital assistants do not work in splendid isolation. They form a global network and learn from each other. The Alexas, Siris, Cortanas and Google Assistants provide their real bosses with an increasingly accurate picture of the lives of their users, but also, indirectly, of the lives of the people those users interact with.
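To make the aggregation tangible, the sketch below shows how individually harmless records from different apps could be merged into a single profile of someone who never agreed to anything. Only the details from the example above are reused; the field names, address and phone number are invented, and real assistants obviously do not expose their internals like this.

```python
from collections import defaultdict

# Invented records, each coming from a different app the assistant can reach.
calendar_event = {"person": "Sanne de Vries", "event": "Sanne's Birthday Party", "date": "2019-08-14"}
navigation     = {"person": "Sanne de Vries", "home_address": "Example Street 1, Utrecht"}
payment        = {"person": "Sanne de Vries", "amount_eur": 50, "note": "trip to Sudan"}
message        = {"person": "Sanne de Vries", "mobile": "+31 6 12345678",
                  "text": "I still can't believe you're going to be a mother"}

def merge_profiles(*records):
    """Merge per-app records into one profile keyed on the person's name.

    None of the inputs is very sensitive on its own, but the combined
    profile covers identity, address, finances, travel plans and even a
    health-related fact (the pregnancy)."""
    profiles = defaultdict(dict)
    for record in records:
        person = record["person"]
        profiles[person].update({k: v for k, v in record.items() if k != "person"})
    return dict(profiles)

print(merge_profiles(calendar_event, navigation, payment, message)["Sanne de Vries"])
```

The crux is the merge step: the risk sits not in any single record, but in the link that the assistant is uniquely positioned to make across apps.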
Mission accomplished
Creating and ‘running’ an undercover agent is time-consuming and very expensive. When the value of the information exceeds the costs, the mission is a success. A similar calculation applies to the digital assistant. We have to realize that assistants offered under the pretext of being ‘free’ are created and run by commercial companies, and those companies make a lot of money from information about your life and mine.