Amazon Echo users will no longer have option to process Alexa voice recordings on-device | Technology News

Days after unveiling a revamped Alexa with generative AI features, Amazon has reportedly said it will no longer process users’ commands and voice recordings locally for Echo devices such as speakers and smart displays.
Instead, the voice recordings will be sent to and processed in the tech giant’s cloud. The policy change takes effect from March 28 onwards, according to a report by Ars Technica.
“As we continue to expand Alexa’s capabilities with generative AI features that rely on the processing power of Amazon’s secure cloud, we have decided to no longer support this [on-device processing] feature,” the Jeff Bezos-founded company reportedly said in an email to Echo customers.
Only users who have enabled the ‘Do Not Send Voice Recordings’ option on their Echo devices received the email, as per the report.
“Alexa voice requests are always encrypted in transit to Amazon’s secure cloud, which was designed with layers of security protections to keep customer information safe. Customers can continue to choose from a robust set of controls by visiting the Alexa Privacy dashboard online or navigating to More > Alexa Privacy in the Alexa app,” it added.
Amazon further said it will automatically delete recordings of users’ Alexa requests after processing them.
The updated policy raises privacy concerns, as users may not want Amazon to have access to the personal requests made in their homes via their Echo speakers or smart displays.
“The Alexa experience is designed to protect our customers’ privacy and keep their data secure, and that’s not changing. We’re focusing on the privacy tools and controls that our customers use most and work well with generative AI experiences that rely on the processing power of Amazon’s secure cloud. Customers can continue to choose from a robust set of tools and controls, including the option to not save their voice recordings at all. We’ll continue learning from customer feedback and building privacy features on their behalf,” a company spokesperson was quoted as saying by The Verge.
The subscription-based, generative AI version of Alexa called Alexa+ is set to launch in the coming weeks. It has several new capabilities, including a feature known as Alexa Voice ID that lets the AI assistant recognise who is speaking to it. Voice ID also enables Alexa+ to share user-specified calendar events, reminders, music, and more.
However, Amazon has previously said that Voice ID may not work if users opt not to save any voice recordings.
History of alleged privacy violations
Amazon does not have a perfect track record of protecting user privacy. The company’s employees were allowed to listen to Alexa voice recordings to train its speech recognition and natural language understanding systems. These staffers listened to as many as 1,000 audio samples during their nine-hour shifts, according to a 2019 Bloomberg report.
Users had also accused Amazon of failing to properly inform them that their Alexa voice recordings could be stored by the company unless it was explicitly instructed not to do so.
In 2023, Amazon paid $25 million in civil penalties after a privacy lawsuit alleged that it retained children’s interactions with Alexa indefinitely. That same year, Amazon’s Ring doorbell unit paid $5.8 million to settle with the US Federal Trade Commission (FTC) over privacy violations.
The FTC had alleged that thousands of Amazon employees and contractors were allowed to watch video recordings of customers’ private spaces captured through its Ring cameras.