Alexa just got eyes.
The e-tailer’s latest addition to its cadre of smart assistants is called the Echo Look. Much like the Echos before it, you can ask it to play music, read news headlines, or recite your forthcoming schedule. But unlike its brethren, the $200 device also sports cameras that will snap photographs or videos of you when you ask it to. The idea: to act as a kind of smart mirror for the fashion conscious. It even allows you to use depth perception to blur the background and, in Amazon’s words, “make sure your outfits pop.”
You can use it to see how you actually look from the side or behind, or maybe watch a short video clip to … er, see what your outfit looks like on the dance floor, we guess? There’s also the option to use Amazon’s Style Check feature—already available for Prime members in the Amazon app—which allows you to offer up two images of different outfits to Amazon’s fashion AI and get advice on which looks best.
How well that will work if you’re a true fashionista pushing the envelope of style, or a dullard whose biggest clothing choice is between a pale blue and an off-white button-down? We’re not so sure. But what is certain is that the device is part of Amazon’s continued push into the fashion market—which some analysts reckon could make it America’s top clothing retailer this year.
The device is fun, and only $20 more expensive than the camera-free Echo speaker, so it will, like its siblings, undoubtedly prove popular. But it’s interesting to think about what the addition of eyes to Alexa’s sensory gamut will mean for Amazon. Beyond hacking concerns raised by placing a connected camera in the location where you dress—and, presumably, undress—there’s the bigger question of what the company itself does with your data.
We asked Amazon, and it confirmed that the images and video gathered by Echo Look will be stored on the company's cloud. That’s also where the AI processing of those images will be performed. It’s much the same as the way audio is handled by every other device in the Alexa range, with recordings uploaded and stored on Amazon’s servers every time the assistant hears its name.
That’s great news for Amazon. As our own Tom Simonite has reported, the huge quantities of data supplied by people making voice commands to their Echo are enabling the firm to make impressive breakthroughs in what voice assistants can do. Adding a camera means Amazon will be able to collect huge troves of visual data that it can analyze to determine your tastes in color and style, make recommendations, and then learn from how you respond.
But there are more insights lurking inside those shots, too—about your home decor and whatever else happens to be in frame when you capture selfies. It’s now easy to identify objects in images, and doing so will be fair game for Amazon.
By this point, of course, if you're an Amazon Prime customer, the company has already learned a great deal about your preferences based solely on your shopping habits. To be sure, the Echo Look represents another level of giving up personal data. But maybe it's worth it if you look good doing it?