Qeexo’s Touch Screen Tech Can Distinguish Fingernails from Knuckles
Increasing the ways users can prod touch screens could open up new features on mobile devices.
As touch screens become more widespread, innovations in how we interact with them promise to make them more useful.
As smartphones skyrocketed in popularity, we got used to using our fingertips to navigate their touch screens with pokes, taps, swipes, and two-finger pinches. There’s more to our fingers than just the tips, though, and a startup called Qeexo aims to take advantage of this with technology that can differentiate between fingertips, nails, and knuckles.
The San Jose, California-based company’s technology, called FingerSense, can be used to do things like bring up a menu of options (akin to right-clicking on a mouse) on an e-mail with the knock of a knuckle, or enable new kinds of controls in games. Currently in talks with phone manufacturers, Qeexo hopes to have FingerSense installed in smartphones within a year.
Qeexo was spun out of Carnegie Mellon University, where it began several years ago as a research project of computer-human interaction graduate students Chris Harrison and Julia Schwarz. Harrison has already gained notice for his work in touch-screen technology—at Disney Research, he helped develop Touché, which can make nearly anything a computer-input device (see “Innovators Under 35: Chris Harrison, 28”).
Initially, the idea that morphed into Qeexo was meant to be used for detecting different types of styluses used on a large touch screen—a surface built by modifying a large, backlit Ikea table.
The project got them thinking about how such technology could be used to differentiate between different parts of the finger on a smartphone screen. Poking and swiping have been common on these devices since Apple released the first iPhone in 2007, but certain functions that should be easy to pull up across multiple apps—such as copying and pasting—have remained clumsy, so the area seemed ripe for innovation.
“We don’t go around the world poking at it,” Harrison says. “I couldn’t even get out of this room if all I could do was poke.”
They shrank down and refined their technology, packing it into a prototype they made by modifying a Samsung smartphone.
Qeexo’s technology relies in part on an acoustic sensor that can capture the sounds—mechanical vibrations—made by different types of on-screen touches from the different parts of a finger. Software on the phone, which has been trained with multiple people to tell the difference between various touches, uses information gathered by the acoustic sensor, along with data like where the touch occurred and how big it was, to make an educated guess about how a person is touching the screen.
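The article doesn't disclose FingerSense's actual features or model, but the pipeline it describes—acoustic features from the touch's vibration combined with touch position and contact size, fed to a classifier trained on many people's touches—can be illustrated with a minimal sketch. Everything below is invented for illustration: the feature names (a spectral centroid and a contact area), the toy training values, and the nearest-centroid classifier are assumptions, not Qeexo's implementation.

```python
# Hypothetical sketch of the kind of touch classification described above:
# combine an acoustic feature of the touch's mechanical vibration with
# touch geometry (contact size), then guess which part of the finger hit
# the screen. All features and training values here are invented.
import math

# Toy training set: (spectral_centroid_hz, contact_area_mm2) per touch type.
# A real system would use many acoustic features and samples from many users.
TRAINING = {
    "fingertip": [(900.0, 55.0), (850.0, 60.0), (950.0, 50.0)],
    "nail":      [(2400.0, 8.0), (2500.0, 10.0), (2300.0, 9.0)],
    "knuckle":   [(400.0, 80.0), (450.0, 75.0), (380.0, 85.0)],
}

def _centroid(samples):
    """Mean feature vector of a list of (freq, area) samples."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(2))

CENTROIDS = {label: _centroid(s) for label, s in TRAINING.items()}

def classify_touch(spectral_centroid_hz, contact_area_mm2):
    """Nearest-centroid guess at which part of the finger touched the screen."""
    def dist(c):
        # Scale each feature roughly into comparable units before
        # measuring distance, so neither dominates.
        return math.hypot((spectral_centroid_hz - c[0]) / 1000.0,
                          (contact_area_mm2 - c[1]) / 50.0)
    return min(CENTROIDS, key=lambda label: dist(CENTROIDS[label]))
```

On this toy data, a sharp high-frequency tap with a tiny contact patch classifies as a nail, while a dull low-frequency knock with a large contact patch classifies as a knuckle—mirroring the intuition that different parts of the finger sound and land differently on glass.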
I met up with Harrison and Schwarz in San Jose, where they showed me demos of several apps built to showcase their technology on the aforementioned Samsung smartphone. A drawing app allows you to draw with the tip of a handheld stylus, erase with its opposite rubber end, smudge the picture with your finger, and knock twice with your knuckle to bring up a new page. A photo-viewing app allows the user to knuckle-tap on an image to pull up a menu with options to do things like print, e-mail, or delete that photo.
Their version of the popular game Fruit Ninja requires the user to knock with a knuckle to crack a coconut, squish berries with a fingertip, slice bananas with a nail, or tap the screen with two knuckles simultaneously to clear a bunch of fruit at once. It was tricky to get the knack of all the different touches—I’m a Fruit Ninja whiz, but even I had a hard time with the modified game. Still, the technology worked quite well—for the most part, it accurately and speedily identified each touch.
But it may be a while before such technology is common.
Creative Strategies analyst Ben Bajarin says people are still getting used to using touch screens and multi-touch functions. He also points out that a number of parties will have to participate for something like FingerSense to become popular—not just handset makers and Qeexo itself, but software developers, too.
“It just takes time for ecosystems like that to develop and get on board. I’d say a few years at a minimum, but we’re probably looking at longer than that,” he says.