When Ryder Ziola places a bell pepper on the kitchen work surface in front of him, the tabletop springs to life, suggesting recipes and other information. He can also use the work surface like a touch screen, selecting options with a finger to see, for example, what ingredients might go well with his pepper. Ziola, a graduate student at the University of Washington, developed the system, dubbed Oasis, with researchers at Intel Labs Seattle led by senior scientist Beverly Harrison. Ziola is demonstrating Oasis at the ninth annual Intel Research Day, held at the Computer History Museum in Mountain View, CA.
“If you put, for example, a steak on the surface, it will recognize the steak and come up with recipes,” says Ziola. “It may also come up with nutritional information.” The system’s camera can also track the motion of a person’s hand and discern whether it is touching the surface, making the projected display interactive.
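The touch sensing described above can be sketched with simple depth differencing. This is a hypothetical illustration, not Intel's actual algorithm: it assumes a downward-facing depth camera delivering frames as 2-D arrays of distances in millimetres, and the threshold value is invented for the example.

```python
import numpy as np

# Hypothetical sketch: detecting surface touches from a depth camera.
# Assumes depth frames are 2-D arrays of distances in millimetres,
# with the camera looking straight down at the work surface.

TOUCH_THRESHOLD_MM = 15  # assumed: fingertip counts as "touching" within this

def capture_surface_model(empty_frame: np.ndarray) -> np.ndarray:
    """Record the depth of the bare surface once, with nothing on it."""
    return empty_frame.astype(float)

def touch_mask(frame: np.ndarray, surface: np.ndarray) -> np.ndarray:
    """Pixels closer to the camera than the surface, but within the
    touch threshold of it, are treated as fingertips in contact."""
    height_above = surface - frame.astype(float)  # positive = above surface
    return (height_above > 0) & (height_above < TOUCH_THRESHOLD_MM)

# Toy example: a flat surface 1000 mm away, with a "fingertip" 5 mm
# above it in one corner and a hovering hand 80 mm above elsewhere.
surface = capture_surface_model(np.full((4, 4), 1000))
frame = surface.copy()
frame[0, 0] = 995   # 5 mm above the surface -> registers as a touch
frame[3, 3] = 920   # 80 mm above -> hovering, not a touch

mask = touch_mask(frame, surface)
print(mask[0, 0], mask[3, 3])  # True False
```

Comparing each frame against a stored model of the empty surface is what lets the same camera distinguish a hand resting on the tabletop from one merely passing over it.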
A finger touch can bring up a timer, or call up images or video offering guidance on a particular step in a recipe. When two ingredients are placed on the surface together, Oasis suggests recipes that combine them. Any of the information displayed on the surface can be dismissed by sweeping a hand across the projected images.
Oasis uses a palm-sized “pico-projector” made by Microvision to project images onto the surface. The positioning and recognition of objects are worked out using a depth-perceiving camera made by PrimeSense, the company that supplies sensors for Microsoft’s Kinect gesture controller for the Xbox. Although the camera could be used to recognize objects by their 3-D shape, recognition currently relies only on color information. “Being able to sense depth can make recognition easier and more robust,” says Ziola, who adds that this capability will eventually be added to the system.
Oasis can be rapidly trained to recognize new objects. When presented with a pack of gum, the system needed only a few mouse clicks to register it as a new object to track. “It really just needs a snapshot of it,” says Ziola.
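The color-only recognition and one-snapshot training described in the last two paragraphs can be sketched along these lines. This is a minimal illustration of the general technique (normalized color histograms compared by histogram intersection), not Oasis's actual recognizer; all names and the bin count are assumptions.

```python
import numpy as np

BINS = 4  # assumed: coarse bins per channel keep the histogram small and robust

def color_histogram(image: np.ndarray) -> np.ndarray:
    """Normalized joint RGB histogram of an HxWx3 uint8 image."""
    quantized = image // (256 // BINS)  # map 0..255 -> 0..BINS-1 per channel
    flat = (quantized[..., 0] * BINS + quantized[..., 1]) * BINS + quantized[..., 2]
    hist = np.bincount(flat.ravel(), minlength=BINS ** 3).astype(float)
    return hist / hist.sum()

class SnapshotRecognizer:
    """Hypothetical one-shot recognizer: enroll from a single snapshot,
    identify by nearest color-histogram match."""

    def __init__(self):
        self.models = {}  # object name -> enrolled histogram

    def enroll(self, name: str, snapshot: np.ndarray) -> None:
        """One snapshot is enough to add a new object to track."""
        self.models[name] = color_histogram(snapshot)

    def identify(self, image: np.ndarray) -> str:
        """Return the enrolled object whose histogram overlaps the
        query most, using histogram intersection as the score."""
        query = color_histogram(image)
        return max(self.models,
                   key=lambda name: np.minimum(self.models[name], query).sum())

# Toy example with solid-color "snapshots" standing in for camera crops.
red_pepper = np.zeros((8, 8, 3), dtype=np.uint8); red_pepper[..., 0] = 200
steak = np.zeros((8, 8, 3), dtype=np.uint8); steak[..., 0] = 120; steak[..., 2] = 60

rec = SnapshotRecognizer()
rec.enroll("bell pepper", red_pepper)
rec.enroll("steak", steak)
print(rec.identify(red_pepper))  # bell pepper
```

A color histogram throws away shape, which is why depth information, as Ziola notes, would make recognition more robust; but it also explains why a single snapshot suffices for enrollment.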
“Because the user interaction takes place in real time, performance is important,” says Harrison. “We’ve managed to get this working on an ordinary laptop, but it could also be handled by another device in the home, like a set-top box.” Intel recently announced it would be providing its one-gigahertz Atom processor for Google’s TV set-top boxes, giving this platform enough power to take on such extra work.
“This can really apply to any countertop,” says Harrison. “It could be your coffee table, or a bathroom surface recognizing the pills you need to take.” Interactive tabletops of various kinds have been demonstrated before, “but now the projectors are so small you can actually think about sprinkling 10 of these around your house,” says Harrison.
The kitchen is a good place for such a system, says Gene Becker, whose Lightning Laboratories consultancy specializes in ubiquitous computing and augmented reality. “The kitchen is a place that is very information-rich.” He points out that ingredients can be linked with nutritional information or potential recipes.
“The kitchen is also an environment that does not tend to be high tech, though,” Becker adds. As for the prospect of pico-projectors becoming a common feature in the home, Becker suggests that they will become ubiquitous on cell phones first. “I think embedding into the home is one of the later places they’ll appear,” he says, explaining that they’ll be embraced more easily as part of another gadget rather than as a new, specialized one. “Retrofitting projectors is something only the technically minded will do, and it took a long time for even networking infrastructure to be standard in new homes.”