“Hey Siri, are you recording our conversations?”
Most likely, Siri will give you a diplomatic answer or simply Google something for you. Sounds funny, right? It shouldn’t.
A couple of months ago, we found out that Amazon was listening to your conversations with Alexa, and similarly Google with Google Assistant. This time, it is Apple that is recording your confidential conversations with Siri. It was reported that Google and Amazon employed workers to listen to voice recordings captured through their smart devices, with some of those workers even able to access a user’s address and contact number. Well, how could Apple not be on the list? Bingo!
According to The Guardian, Apple employs human contractors to review Siri recordings that contain confidential user data. The Guardian reported that Apple contractors regularly hear confidential information, secret deals, and recordings of couples as part of their job providing quality control, or “grading the responses”, for the company’s Siri voice assistant. The recordings are accompanied by information indicating whether the response Siri provided was helpful to the user and how often Siri was triggered unintentionally.
Apple confirmed to The Guardian that, “A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” The company added that a very small random subset, less than 1% of daily Siri activations, is used for grading, and that those recordings are typically only a few seconds long. But how true is this?
How did The Guardian learn about this and report it?
A whistleblower working for Apple, who asked to remain anonymous due to fears over their job, expressed concerns about this lack of disclosure, particularly given the frequency with which accidental activations pick up extremely sensitive personal information. The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”
An interesting point here: Amazon and Google allow users to opt out of some uses of their recordings; Apple offers no similar privacy-focused option short of disabling Siri entirely. These companies record your conversations to improve their AI, but how frightening is it that a random individual could be listening to your private conversations? Are your smart assistants really making you smarter and your life easier? Time to think about it, without asking your mobile assistant.