On a Saturday this spring, four game designers packed a cooler filled with dry ice and several hundred tea-light-size ice pops, loaded it onto an airport security scanner in New York, and calmly explained to a TSA agent that the pops weren’t just desserts. At the Game Developers Conference in San Francisco later that week, the frosty treats would be placed one by one into a 3-D-printed video game controller and used as icy buttons for players to operate with their tongues.
That’s right, their tongues. In Planet Licker, a single-player game originally created over 48 hours at the Ludum Dare game jam last August, players guide a pixelated monster by physically licking ice pops. The inch-and-a-half-tall pops sit in tiny metal cups that are wired to a Makey Makey circuit board in the game’s controller. When players hold the controller and touch their tongue to any of the three brightly colored pops in it, the action controls the on-screen monster. Licking a specific flavor—grapefruit-honey, pink lemonade, or beet, for instance—guides the monster to the nearest digitized ice planet corresponding with that pop’s color.
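Because the Makey Makey presents itself to a computer as an ordinary keyboard, a lick that completes a circuit simply arrives at the game as a key press. Here is a minimal sketch of how that input mapping could work — the key assignments and planet colors are illustrative guesses, not Planet Licker’s actual configuration:

```python
# The Makey Makey registers a completed circuit (a tongue touching a pop's
# metal cup) as a standard keyboard event, so the game can map key codes
# to flavors and flavors to on-screen destinations.
# Hypothetical wiring: each pop is connected to one Makey Makey input.

FLAVOR_BY_KEY = {
    "left": "grapefruit-honey",
    "down": "pink lemonade",
    "right": "beet",
}

PLANET_COLOR_BY_FLAVOR = {
    "grapefruit-honey": "orange",
    "pink lemonade": "pink",
    "beet": "red",
}

def handle_lick(key):
    """Translate a key event from the circuit board into a movement target.

    Returns the color of the planet the monster should head toward,
    or None if the key isn't wired to a pop.
    """
    flavor = FLAVOR_BY_KEY.get(key)
    if flavor is None:
        return None  # ignore keys not wired to a pop
    return PLANET_COLOR_BY_FLAVOR[flavor]
```

In this sketch, a lick on the pop wired to the "down" input would steer the monster toward the pink planet; the real game’s wiring may differ.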
The game is admittedly silly and commercially impractical, but making money isn’t Planet Licker’s real goal, says Andy An, the industrial designer who built the tongue-operated controller. “We really were just interested in considering other senses in game development and exploring different foods to hack,” he says.
Developers and researchers alike have long searched for safe and practical ways to incorporate taste and smell into games, in an effort to create immersive experiences that appeal to all the senses. Planet Licker is the only game (that we know of) using edible ice buttons. Other technologies aimed at making games smellier and tastier include National University of Singapore’s “electronic lollipop,” which sends electrical signals to the tongue to give users the sensation of tasting something bitter, sweet, salty, or sour, and the FeelReal VR, a mask that fits over the nose and mouth and connects to a virtual-reality headset like an Oculus Rift or Sony Morpheus. FeelReal can deliver up to seven different scents, including flowers, meat, and burning rubber, and for $300, users can create their own customized smells.
These technologies have enormous potential for changing the way we play games—imagine savoring pixelated feasts or sniffing out stinky monsters before they attack—but incorporating taste and smell into games goes beyond just building better entertainment experiences, says Heather Kelley, an assistant teaching professor at Carnegie Mellon University who teaches sensory interaction design. There are also health and rehabilitation applications.
Virtual Afghanistan, a VR program that’s currently used to treat veterans suffering from combat-related post-traumatic stress disorder, already uses smell as well as visual, audio, and tactile cues to help patients learn to control their reactions to trauma. Meanwhile, in Sweden, researchers at Malmö University are using a smell-based game called Nosewise to see if stimulating a person’s olfactory senses (and the memories attached to certain scents) can slow down cognitive degeneration that happens with conditions like Alzheimer’s and dementia.
“If we only think of ‘games’ as software you play at home on a screen, you ignore a huge swath of the creative and commercial opportunity here,” Kelley says.
Delivering taste and smell is tough. There are several reasons why both senses have traditionally been left out of game design—hygiene and allergy considerations, for starters. To overcome those obstacles, the pops in Planet Licker protrude above the top of the controller, letting players lick without touching the controller itself, and the melty runoff collects in the pops’ metal cups, containing spills and limiting the spread of germs.
Planet Licker is a good “first step” toward incorporating taste into game play, says Kelley, but it will still be a while before designers figure out how to use taste and smell to advance storytelling to the same extent as the visual or audio cues currently used.
Kelley’s own sensory design work focuses on smell. For her 2009 horse-themed game Sugar, she built an “action olfactorizer” that could open small vials of scented liquid, heat them, and waft them at players. When players performed well, they would smell fresh-cut grass. When they performed poorly, their noses were greeted with the odor of horse manure, which Kelley made herself from the real thing.
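The device’s core logic — rewarding good play with one scent and punishing poor play with another — can be sketched in a few lines. The threshold and trigger function below are invented for illustration; the actual olfactorizer’s control code isn’t described in the article:

```python
# A hedged sketch of performance-driven scent selection, as described above:
# good play gets fresh-cut grass, poor play gets horse manure.
# The score threshold is an assumption, not Sugar's real tuning.

def choose_scent(score, threshold=0):
    """Pick which vial the dispenser should open and heat."""
    return "fresh-cut grass" if score >= threshold else "horse manure"

def on_round_end(score):
    """Hypothetical hook the game might call when a round finishes."""
    scent = choose_scent(score)
    # In the real device, this is where a vial would be opened,
    # heated, and wafted toward the player.
    return scent
```

The interesting design choice here is that the scent is purely reactive feedback — which is exactly Kelley’s critique in the next paragraph: it reinforces what players already know rather than carrying new information.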
The scents added to the game but didn’t “give you a revelation or any kind of information that you didn’t already have,” Kelley says. “How do we use smell in recalling a character or an event that happens in a game? How can we use smell to trigger a memory of something that happened in the fiction? That’s where we haven’t gone yet, and you could say similar things for taste.”