Putting Virtual Controls on Your Arm
Researchers at Carnegie Mellon University and Microsoft have developed Skinput, an acoustic biosensor that turns an arm into a crude touch screen.
An armband, worn around the bicep, detects the minute sound waves that travel through the skin when it is tapped. The researchers designed software that can distinguish where a tap originated, since the acoustic signal varies with slight differences in underlying bone density, mass, and tissue. The system then translates these locations into button commands. A pico projector embedded in the armband projects a display, such as a game of Tetris or a set of button controls, onto the user's palm or forearm.
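The classification step can be sketched as a standard supervised-learning problem: record a few calibration taps per location, summarize each location's acoustic signature, then match new taps to the nearest stored signature. The sketch below is purely illustrative, with synthetic feature vectors and a simple nearest-centroid rule; the researchers' actual features and classifier are not described here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: five tap locations on the arm, each producing a
# characteristic acoustic signature (here a synthetic 8-dimensional
# feature vector standing in for, e.g., band energies of the sound
# waves picked up by the armband's sensors).
N_LOCATIONS = 5
N_FEATURES = 8
N_TRAIN = 40  # calibration taps recorded per location

# Each location gets a distinct mean signature; individual taps
# scatter around it with some measurement noise.
true_centers = rng.normal(0.0, 1.0, size=(N_LOCATIONS, N_FEATURES))
train = np.stack([
    center + 0.2 * rng.normal(size=(N_TRAIN, N_FEATURES))
    for center in true_centers
])  # shape: (locations, taps, features)

# "Training": store the mean signature per location.
centroids = train.mean(axis=1)

def classify_tap(features: np.ndarray) -> int:
    """Return the index of the tap location whose stored signature
    is closest (Euclidean distance) to the observed features."""
    dists = np.linalg.norm(centroids - features, axis=1)
    return int(np.argmin(dists))

# A new tap near location 3 should be mapped to button 3.
new_tap = true_centers[3] + 0.2 * rng.normal(size=N_FEATURES)
button = classify_tap(new_tap)
```

In a real system the per-user calibration matters: bone density and tissue differ between people, so the stored signatures would be collected afresh for each wearer.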
The researchers found that the system achieved 95.5% accuracy when five points on the arm were designated as buttons. They will present their results at this year's CHI conference next month.
See the researchers present Skinput below.