Tired of constantly pressing the wrong buttons on a too-sensitive, tiny touch screen? Researchers at the Ishikawa Komuro Laboratory at the University of Tokyo have created a camera system that attaches to a mobile device to let it track mid-air finger movements and translate those movements into commands.
The camera recognizes whether the finger is moving toward or away from it, and at what speed. This lets a user move a mouse pointer, zoom and scroll pictures, digitally draw, and type, without ever touching the screen.
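As an illustration of the idea (not the laboratory's actual algorithm), the mapping from finger motion to commands can be sketched as follows: estimate the finger's distance from the camera in successive frames, compute the direction and speed of motion, and translate that into an interface action. The distances, frame rate, and dead-zone threshold below are hypothetical values chosen for the example.

```python
def classify_motion(distances_mm, dt=1/30):
    """Classify finger motion from a short series of per-frame depth estimates.

    distances_mm: finger-to-camera distances, one per frame (mm).
    dt: frame period in seconds (assumes a 30 fps camera here).
    Returns (direction, speed_mm_s). Illustrative sketch only.
    """
    delta = distances_mm[-1] - distances_mm[0]
    speed = abs(delta) / (dt * (len(distances_mm) - 1))
    if abs(delta) < 2.0:          # dead zone: ignore jitter under 2 mm
        return ("hold", 0.0)
    return ("toward" if delta < 0 else "away", speed)

# Map motion toward the camera to zoom-in, motion away to zoom-out.
GESTURES = {"toward": "zoom_in", "away": "zoom_out", "hold": "idle"}

direction, speed = classify_motion([120.0, 115.0, 109.0, 102.0])
print(GESTURES[direction], round(speed, 1))  # prints: zoom_in 180.0
```

A real system would add smoothing over more frames and per-user calibration, but the core loop, depth estimate in, command out, is this simple.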
Phones that recognize gestures could help users avoid fumbling around on touch screens, or alleviate the physical strain caused by typing. Microsoft’s Project Natal will use a similar, full-body motion-tracking interface for gaming.
When designing an embedded system, choosing which tools to use often comes down to a decision between building a custom solution and buying off-the-shelf tools.