Apple Contractors Hear "Sexual Acts" On Accidental Siri Recordings

Apple contractors may have heard your private conversations, and even your private acts, via accidental Siri recordings, according to a whistleblower.

Speaking to the Guardian, the anonymous Apple contractor explained that their role is to “grade” the quality of Siri responses and check whether the voice assistant’s activation was accidental or not. 

The revelations echo similar ones about Amazon's Alexa and Google's Assistant: Apple, too, employs humans to do this work in order to improve the responses its smart personal assistant offers. But it also means that private audio clips are processed when Siri mistakenly thinks its wake word ("Hey Siri") has been spoken, as the contractor explains:

“You can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”

The contractor also said incidents of accidental recordings on the Apple Watch were “incredibly high”, presumably because the Watch is designed to be worn regularly, which increases the chances of it picking up audio it shouldn’t be eavesdropping on.

There are two major issues here. The first is that Apple doesn’t make it readily known that your Siri audio may be processed by humans. Apple’s own privacy explainer details that Siri data is sent to “Apple servers” to improve the quality of the service, but it doesn’t explicitly mention that humans are processing it, nor does it mention third-party contractors.

Secondly, the contractor said it wouldn’t be difficult to identify people from the recordings, depending on what’s said, and that there isn’t much vetting of who is hired to process them. However, on the same privacy explainer, Apple repeatedly makes clear that it takes multiple steps to remove anything identifying from Siri data sent to its servers.

“Analysis happens only after the data has gone through privacy-enhancing techniques so that it cannot be associated with you or your account,” the explainer reads.

In a statement to the Guardian, Apple said that a random subset, less than 1% of daily Siri activations, was used for grading. It also said that “user requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
