Apple apologizes over Siri privacy and will no longer retain audio recordings
The announcement follows criticism of the iPhone maker and other technology giants for employing humans to listen to recordings of user interactions with voice assistants in a bid to improve the product.
Apple Inc. apologized for privacy mishaps surrounding its Siri voice assistant and said that it would no longer retain audio recordings of Siri interactions.
Apple had hundreds of contractors listening to Siri in a process called "grading," and the company suspended the program a few weeks ago in the wake of the controversy. It plans to reinstate the grading practice after making a few changes in software updates this fall.
"As a result of our review, we realize we haven't been fully living up to our high ideals, and for that we apologize," Apple said in a statement.
The company said Wednesday that users will be able to opt in to allow Apple to listen to a limited set of anonymized audio samples in order to improve Siri, and to opt out of the program later if they wish. Previously, less than 0.2% of Siri commands were analyzed. While it will no longer store audio recordings, computer-generated transcriptions will be held anonymously for up to six months, Apple said.
In another change, Apple said only its own employees, rather than outside contractors, would listen to audio samples. It also said it is revising the review process to limit the customer data that reviewers can see.
Some users had voiced concern that Apple could be retaining Siri recordings picked up accidentally, whether from a mistaken button press or from the system wrongly detecting the phrase "Hey Siri." Apple said Wednesday it would work to delete such inadvertent recordings.
Bloomberg News reported earlier this year that Apple had a team grading Siri commands to gauge the assistant's performance and fix issues. Apple halted the process earlier this month, Amazon said it would let Alexa users opt out of human review, and Google made similar concessions. Apple also faces a class-action lawsuit over alleged privacy violations.
The use of human reviewers by Apple, Google and Amazon has spurred examinations by lawmakers and regulators in the U.S. and Europe. Privacy advocates have voiced concern that the companies' practices could violate users' rights, particularly in cases where devices begin recording unintentionally or without the user's knowledge.