MIT Technology Review

Omek’s Gestural Interface Makes Perceptual Computing Human-Friendly

By studying human factors (shocking!), an Israeli company makes close-range gestural input make sense.

I recently reviewed Intel’s prototype “perceptual computing” interface, and while the product vision was compelling, the user experience needed a lot of work. Just because you can plop a depth camera on top of your laptop and wave your hands in front of the screen, does that mean you should? Luckily, an Israeli company called Omek Interactive has applied some actual thought and research toward answering this question. Their “Arc Menu” is the first close-range, consumer-grade gestural UI I’ve seen that takes comfort and basic ergonomics into account. Here’s a demo the company did at CES for LazyTechGuys:


The basic idea behind the Arc Menu is the kind of thing that sounds obvious only in retrospect: human beings tend to move their limbs in arcing motions, not vertical or horizontal lines. This means that gridlike interfaces, like the Windows 8 Start screen, feel subtly cumbersome and tiring in a gestural context. Omek did an enormous amount of user testing and iteration to arrive at this insight, but once they had it, the solution was clear: redesign the gestural menu into an arc that curves from the upper right part of the screen downwards. If you’re typing at your computer and raise your right hand to make a swiping movement, it will probably follow this arcing path through the air. 
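To make that geometry concrete, here's a minimal sketch, in Python, of how menu items might be laid out along such an arc in normalized screen coordinates. The pivot point, radius, and angles are my own illustrative guesses; Omek hasn't published the Arc Menu's actual layout logic.

```python
import math

def arc_menu_positions(n_items, center=(0.95, 0.95), radius=0.45,
                       start_deg=180.0, end_deg=270.0):
    """Place n_items along an arc in normalized screen coordinates
    (x and y both run 0..1, with y increasing upward).

    With these defaults, the arc pivots near the top-right corner and
    sweeps from the top of the screen down toward the right edge, so a
    raised right hand swiping through the air traces across the items.
    All of the numbers here are illustrative, not Omek's actual values.
    """
    positions = []
    for i in range(n_items):
        t = i / max(n_items - 1, 1)              # 0.0 .. 1.0 along the arc
        angle = math.radians(start_deg + t * (end_deg - start_deg))
        positions.append((center[0] + radius * math.cos(angle),
                          center[1] + radius * math.sin(angle)))
    return positions

# Five menu items swept from the upper part of the screen down the right side.
for x, y in arc_menu_positions(5):
    print(f"item at ({x:.2f}, {y:.2f})")
```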

Consider, though, the other assumptions that Omek employs to make this design work. For one, their depth camera isn’t mounted on top of the computer screen like a webcam; it’s positioned between the screen and the keyboard, so that its “capture volume” hovers just above the keys, but below the user’s eyeline to the screen. As I said in my Intel review, what good are close-range gestures if they obscure your own view of what you’re doing?
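One rough way to picture that placement: treat the capture volume as a box that hovers over the keys and stops well short of eye level, and only pay attention to hands while they're inside it. The sketch below is my own back-of-the-envelope version in Python, with made-up dimensions; it illustrates the constraint, not Omek's tracking pipeline.

```python
# Assumes hand positions arrive in "desk" coordinates (metres): x across the
# keyboard, y height above the desk surface, z from the screen toward the user.
from dataclasses import dataclass

@dataclass
class CaptureVolume:
    """Axis-aligned box hovering over the keyboard, below the user's eyeline."""
    x_range: tuple = (-0.25, 0.25)   # roughly the width of the keyboard
    y_range: tuple = (0.05, 0.30)    # just above the keys, well below eye level
    z_range: tuple = (0.10, 0.45)    # between the screen and the user's torso

    def contains(self, x, y, z):
        return (self.x_range[0] <= x <= self.x_range[1] and
                self.y_range[0] <= y <= self.y_range[1] and
                self.z_range[0] <= z <= self.z_range[1])

volume = CaptureVolume()
print(volume.contains(0.0, 0.15, 0.25))   # hand hovering over the keys -> True
print(volume.contains(0.0, 0.45, 0.25))   # hand raised into the eyeline -> False
```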

Omek’s designers clearly understand this constraint, and what it implies about how close-range gestures might actually be useful. What makes sense for perceptual computing, at least for now, is quick, casual movements in between touching the screen or typing on the keyboard, not precise, complicated, Minority Report-style “gestural commands.” Further iterations will reveal even more subtleties about how real live humans want to interact with their technology through gestures. But Omek’s work is a welcome reminder that we have to focus on the humans first, not the technology, to make that kind of progress.
