
Amazon Echo’s privacy issues go way beyond voice recordings

Amazon Echo and the Alexa voice assistant have had widely publicised issues with privacy. Whether it is the amount of data they collect or the fact that they reportedly pay employees and, at times, external contractors from all over the world to listen to recordings to improve accuracy, the potential is there for sensitive personal information to be leaked through these devices.

But the risks extend beyond our relationship with Amazon. Major privacy concerns are starting to emerge in the way Alexa devices interact with other services, risking a dystopian spiral of increasing surveillance and control.

The setup of the Echo turns Amazon into an extra gateway that every online interaction has to pass through, collecting data on each one. Alexa knows what you are searching for, listening to or sending in your messages. Some smartphones do this already, particularly those made by Google and Apple, which control the hardware, software and cloud services.

But the difference with an Echo is that it brings together the worst aspects of smartphones and smart homes. It is not a personal device but integrated into the home environment, always waiting to listen in. Alexa even hosts an art project (not created by Amazon), the creepy “Ask the Listeners” function, which makes light of this by commenting on just how much the device is spying on you. Some Echo devices already have cameras, and if facial recognition capabilities were added we could enter a world of pervasive monitoring in our most private spaces, even tracked as we move between locations.

This technology gives Amazon a huge amount of control over your data, which has long been the aim of most of the tech giants. While Apple and Google – who face their own privacy issues – have similar voice assistants, they have at least made progress running the software directly on their devices so they won’t need to transfer recordings of your voice commands to their servers. Amazon doesn’t appear to be trying to do the same.

This is, in part, because of the firm’s aggressive business model. Amazon’s systems appear not just designed to collect as much data as they can but also to create ways of sharing it. So the potential issues run much deeper than Alexa listening in on private moments.

Sharing with law enforcement

One area of concern is the potential for putting the ears of law enforcement in our homes, schools and workplaces. Apple has a history of resisting FBI requests for user data, and Twitter is relatively transparent about reporting on how it responds to requests from governments.

But Ring, the internet-connected home-security camera company owned by Amazon, has a high-profile relationship with police that involves handing over user data. Even the way citizens and police communicate is increasingly monitored and controlled by Amazon.


This risks embedding a culture of state surveillance in Amazon’s operations, which could have worrying consequences. We’ve seen numerous examples of law enforcement and other government bodies in democratic countries using personal data to spy on people – sometimes in breach of the law, sometimes within it but for reasons that go far beyond the prevention of terrorism. This kind of mass surveillance also creates severe potential for discrimination, as it has repeatedly been shown to have a worse impact on women and minority groups.

If Amazon isn’t willing to push back, it’s not hard to imagine Alexa recordings being handed over at the request of government employees and law enforcement officers who might be willing to violate the spirit or the letter of the law. And given international intelligence-sharing agreements, even if you trust your own government, do you trust others?

In response to this issue, an Amazon spokesperson said: “Amazon does not disclose customer information in response to government demands unless we’re required to do so to comply with a legally valid and binding order. Amazon objects to overbroad or otherwise inappropriate demands as a matter of course.

"Ring customers decide whether to share footage in response to asks from local police investigating cases. Local police are not able to see any information related to which Ring users received a request and whether they declined to share or opt out of future requests.” They added that although local police can access Ring’s Neighbors app for reporting criminal and suspicious activity, they cannot see or access user account information.

Tracking health issues

Health is another area where Amazon appears to be attempting a takeover. The UK’s National Health Service (NHS) has signed a deal for medical advice to be provided via the Echo. At face value, this simply extends ways of accessing publicly available information like the NHS website or phone line 111 – no official patient data is being shared.

But it creates the possibility that Amazon could start tracking what health information we ask for through Alexa, effectively building profiles of users’ medical histories. This could be linked to online shopping suggestions, third-party ads for costly therapies, or even ads that are potentially traumatic (think women who’ve suffered miscarriages being shown baby products).

An Amazon spokesperson said: “Amazon does not build customer health profiles based on interactions with nhs.uk content or use such requests for marketing purposes. Alexa does not have access to any personal or private information from the NHS.”

The crudeness and glitches of algorithmic advertising would violate the professional and moral standards that health services strive to maintain. It would also be highly invasive to treat this data the way many Echo recordings are already treated. Would you want a random external contractor to know you were asking for sexual health advice?

Transparency

Underlying these issues is a lack of real transparency. Amazon is disturbingly quiet, evasive and reluctant to act when it comes to tackling the privacy implications of its practices, many of which are buried deep within its terms and conditions or hard-to-find settings. Even tech-savvy users don’t necessarily know the full extent of the privacy risks, and when privacy features are added, users often only learn of them after researchers or the press raise the issue. It is entirely unfair to place the burden on users to discover and mitigate these risks.

So if you have an Echo in your home, what should you do? There are many tips available on how to make the device more private, such as setting voice recordings to automatically delete or limiting what data is shared with third parties. But smart tech is almost always surveillance tech, and the best piece of advice is not to bring one into your home.

In response to the main points of this article, an Amazon spokesperson told The Conversation:

At Amazon, customer trust is at the centre of everything we do and we take privacy and security very seriously. We have always believed that privacy has to be foundational and built in to every piece of hardware, software, and service that we create. From the beginning, we’ve put customers in control and always look for ways to make it even easier for customers to have transparency and control over their Alexa experience. We’ve introduced several privacy improvements including the option to have voice recordings automatically deleted after three or 18 months on an ongoing basis, the ability to ask Alexa to “delete what I just said” and “delete what I said today,” and the Alexa Privacy Hub, a resource available globally that is dedicated to helping customers learn more about our approach to privacy and the controls they have. We’ll continue to invent more privacy features on behalf of customers.

This article has been amended to make clear the “Ask the Listeners” function is an art project created by a third party.
