Upcoming Changes to Siri Privacy Protection


We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading. We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.

Following the suspension of its Siri grading program, in which human contractors listened to Siri audio snippets to help improve the accuracy of Siri's responses, Apple has now apologized for failing to live up to its high ideals and to uphold the level of privacy its users are accustomed to.

However, the practice will resume in-house when upcoming software updates are released, with a few changes to the evaluation process:

First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.

Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.

Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

I'm glad to see Apple take ownership of the situation, offer its apologies, and put forth changes that align with its strong privacy stance and the respect it has for its users. I will be sure to opt in to help improve Siri.

Source: Apple Newsroom.
