Adding Temperature to Human-Computer Interaction
Touch interfaces and haptic feedback are already part of how we interact with computers, in the form of iPads, rumbling video game controllers, and even three-dimensional joysticks. As the range of interactions with digital environments expands, it's natural to ask what's next. Smell-o-vision has been on the horizon for something like 50 years, but there's a dark horse stalking this race: thermoelectrics.
Based on the Peltier effect, these solid-state devices are easy to incorporate into objects of reasonable size, such as video game controllers.
In this configuration, just announced at the 2010 SIGGRAPH conference, a pair of thermoelectric surfaces on either side of a controller rapidly heats up or cools down to simulate conditions in a virtual environment.
The temperature difference isn't large (less than 10 degrees of heating or cooling after five seconds), but the researchers discovered that, as with haptics, just a little sensory nudge can be enough to convince participants in a virtual environment that they are experiencing something like the real thing.
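As a rough illustration of the control problem such a device poses, here's a minimal sketch of mapping a desired temperature offset onto a signed Peltier drive current, where the sign of the current determines whether the surface heats or cools. All names, limits, and the linear mapping are illustrative assumptions, not details from the SIGGRAPH work.

```python
# Hypothetical sketch of driving a Peltier (thermoelectric) element for
# thermal feedback. With a Peltier module, reversing the direction of
# current flow reverses the direction of heat transfer, so one element
# can both heat and cool the touched surface.
# The constants below are illustrative assumptions.

MAX_DELTA_C = 10.0   # the reported change stays under ~10 degrees
MAX_CURRENT_A = 1.5  # assumed safe drive current for a small module

def drive_current(target_delta_c: float) -> float:
    """Map a desired temperature offset (degrees C, relative to skin
    temperature) to a signed drive current, clamped to safe limits.
    Positive current heats the surface; negative current cools it."""
    clamped = max(-MAX_DELTA_C, min(MAX_DELTA_C, target_delta_c))
    # Simple proportional open-loop mapping; a real controller would
    # close the loop with a temperature sensor on the surface.
    return MAX_CURRENT_A * clamped / MAX_DELTA_C
```

A virtual scene could then request, say, a warm offset near a campfire (`drive_current(8.0)` gives a positive, heating current) or a cold one in an ice cave (`drive_current(-6.0)` gives a negative, cooling current), with out-of-range requests clamped to the module's limits.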
This graph shows that users responded to the change in temperature within about a second (for cooling) or after about two and a half seconds (for heating), a difference the researchers attribute to the human palm's differing sensitivity to heat and cold.
The research was conducted at Tokyo Metropolitan University, in collaboration with the National Institute of Special Needs Education. Not coincidentally, among their aims for the device, the researchers list temperature-transmitting interfaces for the blind.
Follow Christopher Mims on Twitter, or contact him via email.