A Team At Amazon Is Listening To Recordings Captured By Alexa

An Amazon spokesperson said that "an extremely small sample of Alexa voice recordings" is annotated.

A team at Amazon that includes both full-time employees and contractors listens to people's audio snippets recorded by devices with the company's Alexa assistant installed, according to a Bloomberg report.

Seven people, described as having worked in Amazon's voice review program, told Bloomberg that they sometimes listen to as many as 1,000 recordings per shift, and that the recordings are associated with the customer's first name, their device's serial number, and an account number. Among other clips, these employees and contractors said they've reviewed recordings of what seemed to be a woman singing in the shower, a child screaming, and a sexual assault. Sometimes, when recordings were difficult to understand — or when they were amusing — team members shared them in an internal chat room, according to Bloomberg.

In an emailed statement to BuzzFeed News, an Amazon spokesperson wrote that "an extremely small sample of Alexa voice recordings" is annotated, and reviewing the audio "helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone."

Additionally, the spokesperson said, "All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption, and audits of our control environment to protect it."

Amazon's privacy policy says that Alexa's software provides a variety of data to the company (including your use of Alexa, your Alexa Interactions, and other Alexa-enabled products), but doesn't explicitly state how employees themselves interact with the data.

Apple and Google, which make two other popular voice-enabled assistants, also employ humans who review audio commands spoken to their devices; both companies say that they anonymize the recordings and don't associate them with customers' accounts. Apple's Siri sends a limited subset of encrypted, anonymous recordings to graders, who label the quality of Siri's responses. The process is outlined on page 69 of the company's security white paper. Google also saves and reviews anonymized audio snippets captured by Google Home or Assistant, and distorts the audio.

On an FAQ page, Amazon states that Alexa is not recording all your conversations. Amazon's Echo smart speakers and the dozens of other Alexa-enabled devices are designed to capture and process audio only after a "wake word" — such as "Alexa," "Amazon," "Computer," or "Echo" — is uttered. However, Alexa devices do occasionally capture audio inadvertently, sending that audio to Amazon's servers or triggering actions in response. In May 2018, an Echo unintentionally sent audio recordings of a woman's private conversation to one of her husband's employees.
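The wake-word design described above can be pictured as a simple gate: audio is checked locally, frame by frame, and nothing is forwarded until a detector flags the wake word. This is only a rough sketch of the idea, not Amazon's implementation; the frame format, the naive text-matching detector, and the function names are all hypothetical stand-ins for what on real devices is an on-device acoustic keyword spotter.

```python
# Hypothetical sketch of wake-word gating. Real devices run an acoustic
# keyword spotter on audio frames; here each "frame" is just a word string.

WAKE_WORDS = {"alexa", "amazon", "computer", "echo"}

def detect_wake_word(frame: str) -> bool:
    """Stand-in for an on-device keyword spotter: a naive text match."""
    return frame.lower().strip(".,!?") in WAKE_WORDS

def process_stream(frames):
    """Return the frames that would be forwarded to the server.

    Frames heard before the wake word stay on the device; once the
    wake word fires, subsequent frames are forwarded as the request.
    """
    forwarded = []
    listening = False
    for frame in frames:
        if not listening:
            # Gate is closed: audio is checked locally and discarded.
            listening = detect_wake_word(frame)
        else:
            # Gate is open: this audio leaves the device.
            forwarded.append(frame)
    return forwarded
```

Under this model, `process_stream(["chatter", "Alexa", "play", "music"])` forwards only `["play", "music"]`, while a stream with no wake word forwards nothing — which is also why a misfired detector, as in the accidental recordings above, means real audio leaves the device.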

In any case, this is all a good reminder that if you'd rather keep something private, it's not a bad idea to unplug your smart speaker.
