
A Techno-Sensory Revolution is Coming, According to IBM

The five human senses? We'll have technologies that stimulate each of them in new ways.
December 18, 2012

In case you hadn't noticed, it is list season. Gift lists and card lists, New Year's resolution lists and, of course, Best of 2012 lists.

IBM has its own twist on this tradition. It has published a list of tech advances that its researchers think will change our lives in the next five years. In a new "5 in 5" report published this week, they describe ways in which technology will be able to enhance, augment, or mimic (to varying degrees) our senses of sight, sound, touch, smell, and taste. IBM is building a lot of that tech in-house, but others are developing their own technology that could contribute to that change.

Take sight. The promise of Google Glass and lookalikes such as competitor Vuzix, or even Google Goggles, indicates how computers are learning to "see" better. IBM's vision report says image processing will get faster and better at learning to recognize human scenes: a beach, say, with a volleyball game or a surfing contest.

When it comes to hearing, Apple's Siri and her legion of competitors are the best example of everyday tech trying to sound out our needs, with varying degrees of success. Perhaps the thing folks remember most about Google Now is how magically its voice search works. But forget adults: IBM researchers are working on a way to tell what a baby is feeling from the sounds it makes, and have even patented a way to track that data.

But perhaps most interesting of the lot is their section on touch-based technology. Screens in the next five years will take on a whole new range of abilities, IBM predicts:

We at IBM Research think that in the next five years that our mobile devices will bring together virtual and real world experiences to not just shop, but feel the surface of produce, and get feedback on data such as freshness or quality.

The idea is to match variable-frequency patterns of vibration to physical objects, so that when a shopper touches what the webpage says is a silk shirt, the screen emits vibrations that the skin translates into the feel of silk.
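To make that concrete, here is a minimal sketch of what a texture-to-vibration lookup could look like. Everything in it is hypothetical: the material names, the frequency and amplitude values, and the function names are illustrative, not IBM's actual scheme.

```python
import math

# Hypothetical texture profiles: each material maps to a vibration
# frequency (Hz) and amplitude (0-1). The numbers are made up for
# illustration; a real system would tune them against human testing.
TEXTURE_PROFILES = {
    "silk":   {"freq_hz": 240.0, "amplitude": 0.15},
    "denim":  {"freq_hz": 60.0,  "amplitude": 0.60},
    "canvas": {"freq_hz": 90.0,  "amplitude": 0.45},
}

def vibration_samples(material, duration_s=0.05, sample_rate=8000):
    """Generate a sine-wave drive signal for a haptic actuator."""
    profile = TEXTURE_PROFILES[material]
    n = int(duration_s * sample_rate)
    return [
        profile["amplitude"]
        * math.sin(2 * math.pi * profile["freq_hz"] * i / sample_rate)
        for i in range(n)
    ]

# When a touch event lands on the "silk shirt" region of the page,
# the app would stream a short burst like this to the vibration motor.
samples = vibration_samples("silk")
print(len(samples))  # 400 samples for a 50 ms burst at 8 kHz
```

The point of the sketch is the mapping itself: the phone already knows what the page says you are touching, so the only new ingredient is a per-material vibration recipe.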

They have a point: Disney's been working on a project called TeslaTouch for some time now. They've built a screen that tickles the nerves in your fingers as you drag a digit across its cold surface. Varying electric field patterns on the touch panel produce the sensation of touch. Finger painting on a screen could have all the sensation of finger painting on a canvas, with none of the mess. When you buy a dress or shirt online, you could paw at the virtual fabric first.

Projects with similar goals are underway at the Linear Actuators lab at the Ecole Polytechnique Federale de Lausanne in Switzerland. In a video, graduate student Christophe Winter explains that you can change a person’s understanding of the material they are touching—that is, the friction they feel—by changing the vibration of the surface they are touching.
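A toy model of that friction effect, under loudly stated assumptions: surface vibration can make a finger feel less friction (in ultrasonic "squeeze-film" haptics, a thin air cushion does the work), and the vibration amplitude becomes the control knob. The constants and function below are invented for illustration and are not EPFL's actual method.

```python
# Hypothetical endpoints: friction felt with the surface still,
# and the floor reached at full vibration amplitude.
BASE_FRICTION = 0.9
MIN_FRICTION = 0.2

def perceived_friction(vibration_amplitude):
    """Linearly interpolate perceived friction between the two
    extremes; vibration_amplitude is normalized to [0, 1]."""
    a = max(0.0, min(1.0, vibration_amplitude))
    return BASE_FRICTION - a * (BASE_FRICTION - MIN_FRICTION)

print(perceived_friction(0.0))  # near 0.9: surface feels grippy
print(perceived_friction(1.0))  # near 0.2: surface feels slick
```

Sweeping the amplitude as the finger moves is what lets one flat panel impersonate many different materials.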

But though our tactile feedback from screens will be enriched, I think there are also ways we'll be touching our devices less on the whole. Consider what the Kinect did for gaming, and the range of other uses its motion-sensing technology is being tested and developed for. As MIT Technology Review wrote earlier this year, there's a fair chance we'll be touching screens less, and gesturing at them more.
