Apple is putting a worldwide hold on a program that had contractors listening to some Siri queries in an effort to grade the digital assistant on its responses. When the program returns, Apple says users will have the choice whether to participate.
Why it matters: Apple touts privacy as a key selling point, making the idea that someone might be listening to Siri queries unsettling, even if only a tiny fraction of queries were being monitored.
Driving the news: The issue came to light after a Guardian report last week that Apple contractors had been privy to all sorts of conversations, including couples having sex and people at doctors' appointments, as part of their work "grading" Siri's response handling.
- "While we conduct a thorough review, we are suspending Siri grading globally," Apple said in a statement to Axios, saying it is "committed to delivering a great Siri experience while protecting user privacy."
- Apple previously said less than 1% of queries were subject to such grading and that recordings were typically only a few seconds long. It also said the queries weren't tied to a particular Apple ID and that those listening worked in secure facilities and were subject to Apple's strict confidentiality rules.
Between the lines: Digital assistants are still in their early days, so the tech giants want ways to measure how well they are performing and identify areas for improvement. However, digital assistants are often awakened accidentally and, as a result, can end up privy to sensitive conversations.
The bigger picture: Google is pausing a similar program for EU residents after a German data protection commissioner announced an inquiry into its practices.