Amazon is now allowing Alexa users to opt out of human review of their voice recordings, Bloomberg has reported. The move comes after a researcher revealed that human contractors had listened to some Google Assistant recordings, prompting wider concern about what voice-activated assistants do with recorded speech.
A new policy took effect Friday that allows customers, through an option in the settings menu of the Alexa smartphone app, to remove their recordings from the pool that Amazon employees and contract workers can analyze, a spokesman for the Seattle company said. It follows similar moves by Apple and Google.
According to Bloomberg, Amazon’s decision to let Alexa users opt out of human review of their recordings follows criticism that the program violated customers’ privacy. Amazon says the Alexa app’s settings menu will now include a disclaimer acknowledging that humans may review recordings made through Alexa; Bloomberg explains how to disable that review and opt out.
The Guardian reported that Apple has suspended its practice of having human contractors listen to users’ Siri recordings in order to “grade” them. The decision came after a Guardian report revealed that Apple’s contractors “regularly” hear confidential and private information while carrying out the grading process, with the bulk of that information captured through accidental triggers of the Siri assistant.
Google, in a post on The Keyword, pointed to tools that let users manage and control the data in their Google account: you can turn off storage of audio data entirely, or set it to auto-delete after 3 or 18 months.