I almost ate a foam doughnut the other day because a virtual-reality experiment had me convinced it was real.
I was in the Virtual Human Interaction Lab at Stanford University, looking at a doughnut with chocolate frosting and rainbow sprinkles through the lenses of a virtual-reality headset, while holding in my hand something that felt very much like the same tasty treat and sniffing an unmistakable chocolatey, doughnut-y smell.
Fortunately, I hesitated. But sight, smell, and touch got my mouth watering.
That reaction is good news for Benjamin Li, a postdoctoral research fellow at Stanford who’s researching virtual reality’s influence on our perceptions of food and investigating how smell and touch could be added to VR.
Though consumer virtual reality is still in its earliest days, VR’s utility for influencing perception has been studied for years, and combining VR and scent has been explored, too, by both academics and companies. Li, who’s working with Jeremy Bailenson, the founding director of the Virtual Human Interaction Lab, thinks the combination of smell, touch, and VR could be used in a bunch of different ways in the future—some more dystopian-sounding than others.
Imagine a world where, say, salmon has become extinct. Maybe you could use a virtual piece of salmon sushi, a salmon-like smell, and a real chunk of some other fish in the middle of a hand roll to give people who’ve never tried it a sense for what it’s like to eat salmon sushi. Or perhaps using scent along with virtual reality could help you eat a healthier diet without feeling that you’re missing out. You might see and smell a juicy cheeseburger while actually chomping on a plant-based patty.
When it comes to food in VR, seeing is not enough to make someone who’s eating one thing believe it’s really another, Li says. To test that, he’s conducting experiments to see how people react when they feel a doughnut in their hands, see one through their VR headsets, and get a whiff from a swab doused in a chocolate scent.
After the fake doughnut is revealed, Li offers participants real ones to eat. This is more than a reward—the researchers want to figure out if people who see, smell, and touch the doughnut have a greater appetite for real doughnuts than those who don’t.
Li isn’t ready to draw conclusions from the data. But he says that anecdotally, once people see, smell, and touch the doughnut virtually they start craving one—not surprising, considering how I felt after my experience in the lab.
I asked Li if he’s experimenting with combining any other food odors with virtual reality, and he said it’s possible; he’s also tracked down popcorn and bacon scents. But what he’s more interested in now is looking at the importance of the food’s immediate environment—like a doughnut shop, for instance, rather than the sparsely furnished room we’re in. One of his next steps, he says, may be building a virtual storefront.
“Imagine if I put you in a doughnut shop and there’s a crowd bustling and you smell the doughnuts in the shop. Perhaps this would make a greater impact on you than being in the lab,” he says.