Should an Amazon Echo Help Solve a Murder?

Authorities have asked Amazon to turn over data from a suspect’s Echo, raising thorny privacy questions.
December 27, 2016

It was only a matter of time. In what appears to be a milestone in the Internet of things era, police have asked Amazon for data that may have been recorded on its Echo device while a murder was taking place.

As the Information reports (paywall), a man named Victor Collins died sometime during the night of November 21, 2015, while visiting James Andrew Bates, a friend from work, at his home in Bentonville, Arkansas. Collins’s body was discovered in a hot tub the next morning, and Bates was charged with first-degree murder.

Bates had several smart devices in his home, the Echo among them. The device typically sits in an idle state with its microphones listening for key words like “Alexa” before it begins recording and sending data to Amazon’s servers. But as the Information points out, it’s not unusual for the Echo to wake up by mistake and grab snippets of audio that people may not have known were being recorded.

Investigators are clearly trying to be thorough, looking for any information that might shed light on what happened that night. For one thing, Bates’s smart water meter indicates he used 140 gallons of water between 1 a.m. and 3 a.m. that day; the prosecution claims this shows Bates was hosing down blood after he killed Collins.

But the case raises a thorny question, or rather a series of them: What is Amazon’s responsibility here? The company has so far denied the authorities’ requests, but should that be allowed? Or should investigators trying to get to the bottom of a potential murder be entitled to the data, even though it was recorded on Bates’s Echo in the privacy of his own home?

A similar problem reared its head earlier this year when Apple dug in its heels against the FBI’s request to unlock the iPhone that belonged to Syed Farook, one of the San Bernardino shooters. As Samford University’s Woodrow Hartzog wrote for us, it was already clear that the murky legal waters Apple and the FBI found themselves in would soon extend to Internet of things devices:

Consider assistance technologies like the Amazon Echo, which are designed to “always listen” for words like “Hello, Echo” but do not fully process, store, or transmit what they hear until they are activated. For law enforcement purposes, most of the information the devices listen to is functionally impossible to recover. Does this mean legal authorities should consider Echo a warrant-proof technology? The emergence of the Internet of things is shrinking the number of “dumb” objects by the day. The government has requested laws that mandate data retention for over 10 years. Must all technologies be built to ensure that what they hear is retained and made available for law enforcement’s inspection?

Hartzog argued that authorities shouldn’t be able to force tech companies to hold on to every bit of data a user creates. Allowing some information, like voice data, to vanish is not necessarily a bad thing.

Of course, the other side of that argument is what authorities in the Collins murder case are contending: there may very well be data on Amazon’s servers that can help bring a criminal to justice. If so, investigators should get access to it.

Apple fought the FBI to a stalemate over a dead terrorist’s iPhone, with no satisfying resolution. As predicted, the case in Arkansas has now brought the conflict into the realm of the Internet of things. The more we use such technologies, the more often these issues are going to come up—and the more complicated they will become until companies and legislators get together and agree on a clear way forward.

(Read more: The Information (paywall), “The Feds Are Wrong to Warn of ‘Warrant-Proof’ Phones,” “Amazon Working on Making Alexa Recognize Your Emotions,” “What if Apple Is Wrong?”)
