For once, I control the weather.

I’m standing in front of a green backdrop inside a windowless studio at Cybernet Systems, a technology research and development company in Ann Arbor, MI. A digital camera in front of me is beaming my image, in real time, to a television monitor that shows a scene typical of a nightly news weather report. There I am, standing before a map of the Midwest. I extend my arm and begin twirling my hand over the blip of Detroit. The map behind me zooms in on the area beneath my palm. The city widens into view and comes into focus. Looks like it’s going to be a wet one, folks.

This is GestureStorm, a software system Cybernet developed to let weather broadcasters run through their forecasts with simple flicks of the hand. No wires. No buttons. No geeky audiovisual control panels. Move a hand one way, and you paint raindrops on-screen. Move it another, and you stir up a tornado. The interface is completely a matter of gesture. And if a lot of people have their way, this is only the beginning. Gesture recognition technology aims to become this millennium’s remote control: a fluid, freeing means of interacting with all the digital stuff around us. Think Minority Report. In that film, Tom Cruise stands before a futuristic digital display, pointing and waving his way through a cascade of images and documents. This stuff, once the domain of science fiction, is finally creeping into the real world.

In Orlando, FL, WKMG became the first television station to use GestureStorm when it unveiled the system in December. In July 2003, Sony Computer Entertainment released the EyeToy, a PlayStation 2 peripheral that uses special software and an inexpensive digital camera to project a video feed of the player into a game and respond to the player’s movements; instead of zapping a bad guy with a controller button, the gamer gives him a swift karate chop. This year, two companies will debut virtual keyboards that let people control personal digital assistants and even automotive equipment with gestures. As far as Charles Cohen, vice president for research and development at Cybernet, is concerned, gesture recognition’s time has come. “Gesture recognition is remote control with a wave of a hand,” he says.

As I unleash some storm clouds over Detroit, I see what he means. Of course, playing weatherman is one thing, but importing gesture recognition into daily life is another, as Cohen and the others pioneering the technology are learning. “I don’t know what the killer app for gesture recognition is yet,” Cohen confesses.
