Putting Virtual Controls on Your Arm
Researchers at Carnegie Mellon University and Microsoft have developed an acoustic biosensor that turns an arm into a crude touch screen.
An armband, worn around the bicep, detects the minute sound waves that travel through the skin when the arm is tapped. The researchers designed software that can distinguish the origin of these acoustic signals, which vary with slight differences in the underlying bone density, mass, and tissue at each tap location. The system then translates these locations into button commands. A pico projector embedded in the armband projects a display, such as a game of Tetris or a set of button controls, onto the user's palm or forearm.
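The classification step described above can be illustrated with a minimal sketch: given an acoustic feature vector extracted from a tap, pick the tap location whose training examples it most resembles. Everything here, from the nearest-centroid approach to the feature and location names, is an assumption for illustration, not the researchers' actual method.

```python
# Hypothetical sketch: classify a tap's arm location from acoustic features
# using a nearest-centroid classifier. Features and locations are invented.
import math

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(labeled_taps):
    """labeled_taps: dict mapping location name -> list of feature vectors.
    Returns one centroid per location."""
    return {loc: centroid(vecs) for loc, vecs in labeled_taps.items()}

def classify(model, features):
    """Return the location whose centroid is nearest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(model, key=lambda loc: dist(model[loc], features))

# Toy training data: 2-D features, e.g. (dominant frequency, amplitude),
# recorded from labeled taps at two hypothetical locations.
training = {
    "wrist": [[100.0, 0.8], [105.0, 0.9]],
    "palm":  [[300.0, 0.3], [310.0, 0.4]],
}
model = train(training)
print(classify(model, [102.0, 0.85]))  # -> wrist
```

A real system would extract far richer features from the acoustic waveform and train on many taps per person, but the mapping from signal to button is the same idea: nearby taps produce similar signatures, and the classifier exploits that.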
The researchers achieved 95.5 percent accuracy in recognizing taps when five points on the arm were designated as buttons. They will present their results at this year's CHI conference next month.
See the researchers present Skinput below.