Siri Data Analysis by Humans

Excerpt from a post on Privacy published on Chambyte on 22 July 2019.

A week or so after publishing a post about Apple's privacy stance, in which I stated the reasons why I trust the tech giant to defend our right to privacy, UK publication The Guardian published an article about how Apple contractors 'regularly hear confidential details' on Siri recordings.

The Guardian:

Although Apple does not explicitly disclose it in its consumer-facing privacy documentation, a small proportion of Siri recordings are passed on to contractors working for the company around the world. They are tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri’s response was appropriate.

Apple says the data “is used to help Siri and dictation … understand you better and recognise what you say”.

But the company does not explicitly state that that work is undertaken by humans who listen to the pseudonymised recordings.

This is not the first time such a revelation has come to light. As Apple analyst Rene Ritchie points out on Twitter, Bloomberg published an article back in April 2019 citing Apple's use of human helpers to listen to and assess Siri data.

A citation in a publication is not enough, however. Apple can and should do better in explicitly disclosing the use of sub-contracted human helpers in this process.

Relating to Siri, on Apple's Privacy Policy page under the Collection and Use of Non-Personal Information:

We may collect and store details of how you use our services, including search queries. This information may be used to improve the relevancy of results provided by our services. Except in limited instances to ensure quality of our services over the Internet, such information will not be associated with your IP address.

The glaring omission here: no explicit mention of who analyses this data in its pursuit of improving the relevancy of results. In continuing its tradition of accountability, Apple should rectify this omission and update its privacy documentation.

Given their journalistic responsibilities, The Guardian et al. are right to publicise this distinct lack of disclosure, along with any potential misuse of such data by the people trusted to examine it, regardless of how anonymous the data is. The Guardian's sex-and-drugs lede, designed to entice readers and raise undue concern rather than take an educative approach, comes as no surprise, however.

Sex and drugs sell in the world of tabloid headline-grabbing 'news' for clicks.
