Say goodbye to the glitchy self-checkout scanners in your local retail store. Grocery buying is about to get a big boost from artificial intelligence.
At a prototype store in Santa Clara, California, you grab a plastic basket, fill it up as you amble down an aisle packed with all kinds of things—Doritos, hand soap, Coke, and so on—then walk to a tablet computer near the door. The tablet shows a list of everything that’s in your basket and how much you owe; you pay, and you leave.
This store is actually the demonstration space of a startup called Standard Cognition, which is using a network of cameras and machine vision and deep-learning techniques to create an autonomous checkout experience.
Standard Cognition cofounder and chief operating officer Michael Suswal says the company hopes to have the technology running in a store—either a partner’s or its own—within six months, most likely in the Bay Area. And while the tablet app currently instructs you to pay, Suswal says a smartphone app for customers could one day handle payment automatically.
If this sounds familiar, it’s because online retail juggernaut Amazon is running a private beta test of a very similar project, Amazon Go, at a store in Seattle. A Stockholm-based startup called Wheelys is testing a similar store in China. This excites the Standard Cognition team, since it seems to validate the idea that this kind of checkout experience, still experimental and perhaps unsettling to some shoppers, could one day be big business.
For now, the company is spending a lot of time refining its technology in its Santa Clara space, where one wall looks very much like a convenience store whose stock comes mostly from Costco. There are big bags of Cheetos and Doritos, giant jars of Skippy peanut butter, and non-food items like paper towel holders and plastic fans.
Standard Cognition uses its cameras to track individual people in real time as they move around the store (Suswal says the company is not doing any facial recognition), and spot the items they take off the shelves. The company trains its deep neural networks to recognize items in the store, too, in a process that takes about two minutes per item and consists of an employee grabbing the item and doing things like turning it over, putting it behind their back, and placing it in a basket in view of the cameras.
You can see a visual representation of this right when you walk into Standard Cognition’s faux store. Against one wall is a big flat-screen monitor, showing a live video feed of the space with a different colored marker denoting each person; whenever you pick something up, it gets a circle and a label on the screen. A video the company released shows the technology in action, as two people stand in the demo store grabbing items and generally trying to perplex the system.
I grabbed a shopping basket to try it out myself. The results, while a little rough, were still impressive. I wandered down the aisle, placing Nilla Wafers, bottles of Coke, and other items in my basket, then taking some out and leaving them behind. I quickly shoved a can of Red Bull up my shirt in hopes that the cameras would miss it, and loaded up on similar-looking items (a bag of Doritos and a bag of Cheetos, as well as two different kinds of Mrs. Meyer’s liquid hand soap).
When I was done, I walked over to a tablet that showed me a list of all the items Standard Cognition thought I had in my basket. It missed one of my two bottles of Coke and added an extra bottle of soap—mistakes I could correct in the checkout app on the tablet. But the list was mostly right, and, to my chagrin, it caught the Red Bull, too.
Brandon Ogle, another cofounder and an engineer at the company, says item classification in the demo store is currently correct 98 percent of the time. Standard Cognition is working to push that number higher, in part by teaching its computers to identify more products—Ogle says the more items the company has added, the more accurate the system has become.
It may take a while for autonomous checkout to become the norm in most stores, though. Tom Davenport, a professor at Babson College and coauthor of Only Humans Need Apply: Winners and Losers in the Age of Smart Machines, thinks that we’ll increasingly see such experiences, but he’s skeptical about how quickly this will happen. After all, he says, self-checkout has been around for about two decades in the U.S. and still hasn’t revolutionized the checkout process here.
“I think nobody these days would suggest that becoming a supermarket point-of-sale clerk is a growth industry,” he says. “But they’ve proven remarkably resistant to going away.”