A View from Kristina Grifantini
Putting Virtual Controls on Your Arm
“Skinput” lets users control a computer by tapping buttons projected onto their body.
Researchers at Carnegie Mellon University and Microsoft have developed an acoustic biosensor that turns an arm into a crude touch screen.
An armband, worn around the bicep, detects the minute sound waves that travel through the skin when it is tapped. The researchers designed software that can distinguish where a tap originated, since the acoustic signal varies with slight differences in the underlying bone density, mass, and tissue. The system then translates these locations into button commands. A pico projector embedded in the armband projects a display, such as a game of Tetris or a set of button controls, onto the user's palm or forearm.
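The idea, at a high level, is a calibration-then-classification loop: each arm location produces a distinctive acoustic signature, and a classifier maps each new tap's features to the nearest trained location, which in turn maps to a command. The sketch below illustrates that loop with a nearest-centroid classifier; the feature values, location names, and commands are invented for illustration and are not the researchers' actual pipeline.

```python
# Hypothetical sketch of Skinput-style tap classification. The armband's
# sensors would yield an acoustic feature vector per tap; here we stand in
# two made-up features (amplitude, dominant frequency band) for the real ones.
from math import dist

# Per-location feature centroids, standing in for the per-user calibration
# the real system performs. All numbers are invented.
CENTROIDS = {
    "palm":    (0.9, 0.2),
    "wrist":   (0.5, 0.6),
    "forearm": (0.2, 0.9),
}

# Mapping from tap location to a UI command (also invented).
COMMANDS = {"palm": "select", "wrist": "scroll", "forearm": "back"}

def classify_tap(features):
    """Return the arm location whose training centroid is nearest."""
    return min(CENTROIDS, key=lambda loc: dist(features, CENTROIDS[loc]))

def tap_to_command(features):
    """Translate a tap's acoustic features into a button command."""
    return COMMANDS[classify_tap(features)]

print(tap_to_command((0.85, 0.25)))  # a tap near the palm centroid -> "select"
```

A real implementation would classify richer features (the paper describes multiple vibration sensors and frequency-band features), but the structure, nearest-match against per-user training data, is the same.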
The researchers achieved 95.5 percent classification accuracy when five points on the arm were designated as buttons. They will present their results at this year's CHI conference next month.