Apple, Google: We've stopped listening to your private Siri, Assistant chat, for now

Google and Apple have suspended programs in which employees and contractors review recordings made by the companies' respective voice assistants.
Written by Liam Tung, Contributing Writer

Apple has suspended a program that sends its contractors recordings of Siri users' queries to check whether the assistant is interpreting them correctly and whether it is being activated by mistake. 

The move follows a report by The Guardian last week based on interviews with a whistleblower who revealed that contractors for Apple regularly hear Siri recordings that contain highly sensitive information. 

The recordings reportedly capture sexual encounters, drug deals, and discussions between doctors and patients.

SEE: Amazon Alexa: An insider's guide (free PDF)

Apple runs the so-called 'grading' program to determine if Siri correctly interprets user queries and whether it's being inadvertently awoken. The snippets of audio are not linked to user IDs.     

Apple issued a statement to TechCrunch on Thursday night, explaining that it had suspended the grading program worldwide while it conducts a review. It will also release a software update that lets Siri users opt in to the program, arguably something it should have offered in the first place.

"We are committed to delivering a great Siri experience while protecting user privacy," Apple said. 

"While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading."

Apple makes a point in its privacy marketing that it doesn't sell personal information to advertisers. Its terms and conditions do state that recordings are collected to improve Siri, but not that they're sent to humans for review.

Apple told The Guardian only a small portion of Siri requests are analyzed for the purpose of improving the technology. The responses are analyzed in secure facilities by reviewers who are obliged to adhere to Apple's confidentiality requirements. The company estimated "less than 1% of daily Siri activations" are used for grading. 

Also: Apple's Siri laughs off abuse, but should she?

Still, the incident doesn't look good for a company that differentiates itself from other tech giants based on its higher standards of end-user privacy. 

News of Apple's human review program follows a report from Belgium that Google employees are "systematically listening" to audio files recorded by Google Home smart speakers and Google Assistant on Android smartphones.   

Google is putting that evaluation program on hold in Europe for three months as of today. The Associated Press reported on Thursday that Google had told Hamburg's commissioner for data protection, one of Germany's regional privacy watchdogs, that it would stop making transcripts of speech data recorded by Google Assistant for three months in the EU.

The Hamburg data-protection office on Thursday announced that it had launched an action to prohibit Google and its contractors from evaluating audio clips from Assistant for a period of three months. 

Google told the watchdog that, throughout the administrative procedure, transcription of voice recordings would be halted for at least three months from August 1, 2019, on an EU-wide basis.

SEE: IT pro's guide to GDPR compliance (free PDF)

Johannes Caspar, the Hamburg commissioner for Data Protection and Freedom of Information, said he had "significant doubts" as to whether Google Assistant complies with Europe's GDPR data-protection rules.

"The use of language-assistance systems in the EU must follow the data-protection requirements of the GDPR. In the case of the Google Assistant, there are currently significant doubts," said Caspar.

"The use of language-assistance systems must be done in a transparent way, so that an informed consent of the users is possible. In particular, this involves providing sufficient information and transparently informing those affected about the processing of voice commands, but also about the frequency and risks of mal-activation. 

He said due regard must be given to the need to protect third parties affected by the recordings. 

"First of all, further questions about the functioning of the speech-analysis system have to be clarified. The data-protection authorities will then have to decide on definitive measures that are necessary for a privacy-compliant operation."
