Tired of constantly pressing the wrong buttons on a too-sensitive, tiny touch screen? Researchers at the Ishikawa Komuro Laboratory at the University of Tokyo have created a camera system that attaches to a mobile device to let it track mid-air finger movements and translate those movements into commands.
The camera detects whether the finger is moving toward or away from it, and at what speed. This lets a user move a cursor, zoom and scroll through pictures, draw, and type, without ever touching the screen.
Phones that recognize gestures could help users avoid fumbling around on touch screens, or alleviate the physical strain caused by typing. Microsoft’s Project Natal will use a similar, full-body motion-tracking interface for gaming.