Google Wants You to Control Your Gadgets with Finger Gestures, Conductive Clothing
New Google technology addresses the tiny-screen problem by letting you control wearables with subtle finger gestures, or by touching your clothes.
Making small, wearable devices easier to control could make them much more useful and popular.
Small gadgets such as smart watches can be frustrating to use because their tiny buttons and touch screens are tricky to operate. Google has two possible solutions for the fat finger problem: control your gadgets by subtly rubbing your finger and thumb together, or by swiping a grid of conductive yarn woven into your clothing.
The first of those two ideas works thanks to a tiny radar sensor that could be integrated into, say, a smart watch, and can detect fine motions of your hands from a distance, even through clothing. For the second, Levi Strauss announced today that it is working with Google to integrate fabric touch panels into its clothing designs. Both projects were announced at Google’s annual developer conference in San Francisco Friday by Ivan Poupyrev, a technical program lead in Google’s Advanced Technology and Projects research group.
The current prototype of Google’s radar sensor is roughly two centimeters square. It can pick up very fine motions of your hands at distances from five centimeters up to five meters.
Poupyrev showed how he could circle his thumb around the tip of his forefinger near the sensor to turn a virtual dial. Swiping his thumb across his fingertip repeatedly scrolled through a list.
“You could use your virtual touchpad to control the map on the watch, or a virtual dial to control radio stations,” said Poupyrev. “Your hand can become a completely self-contained interface control, always with you, easy to use and very, very ergonomic. It can be the only interface control that you would ever need for wearables.”
Poupyrev also showed how he could perform the same motion in different places to control different things. He used the scrolling gesture to adjust the hour on a digital clock, then moved his hand about a foot higher and used the same motion to adjust the minutes.
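Google has not published an API for the sensor, but the idea in that demo can be sketched in pseudocode-like Python: the same recognized gesture is routed to a different control depending on where the hand is. The function name, zone boundary, and control names below are invented for illustration.

```python
# Hypothetical sketch: one gesture, routed by hand position. The 30 cm
# boundary and the "hours"/"minutes" targets are assumptions, not a real API.

def route_gesture(gesture: str, hand_height_cm: float) -> str:
    """Map a recognized gesture to a target control based on hand height."""
    # In the demo, moving the hand about a foot higher switched the target.
    target = "minutes" if hand_height_cm >= 30 else "hours"
    return f"{gesture} -> adjust {target}"

print(route_gesture("scroll", 10))  # scroll -> adjust hours
print(route_gesture("scroll", 45))  # scroll -> adjust minutes
```

The point of the design is that the gesture vocabulary stays small; position, not a new gesture, selects what is controlled.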
No details were given on what kind of devices the radar sensor might be built into. But Poupyrev did say the sensors can be mass produced, and he showed a silicon wafer, made by the chip company Infineon, covered in many of the devices.
Google’s woven touch sensor technology is based on a new way to make conductive fiber, developed by Poupyrev and colleagues as part of an effort Google calls “Project Jacquard.” Conductive yarn was already on the market, but only in gray, he said. Google has developed a way to braid slim copper fibers with textile fibers of any color, producing conductive yarn that existing fabric and garment factories can use just like the yarns they work with today, he said.
“We want to make interactive garments at scale so everyone can make them and everyone can buy them,” he said. Poupyrev showed images of stretchable and semi-transparent fabrics with the touch-detecting yarn woven in.
Rather than being an alternative to a conventional touch screen, the textile touch panels are intended to provide a quicker and subtler way to interact with a phone in your pocket or a device on your wrist, for example, to dismiss a notification.
Poupyrev waved his hand over what looked like a swatch of ordinary fabric to show how a grid of conductive yarn woven into it could detect the presence of his hand and also when he touched it with a finger. It could also track two finger touches at the same time.
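The hardware details of the woven panel are not public, but a grid of conductive yarns behaves like any row/column touch sensor: each yarn crossing yields a reading, and crossings above a threshold count as touches, which is how two fingers can be tracked at once. The readings, threshold, and grid size below are invented for illustration.

```python
# Hypothetical sketch of touch detection on a woven row/column grid.
# The threshold and sample values are assumptions; Project Jacquard's
# actual sensing scheme has not been published.

THRESHOLD = 0.5  # normalized reading above which a crossing counts as a touch

def find_touches(readings):
    """Return the (row, col) yarn crossings whose reading exceeds THRESHOLD.

    `readings` is a 2-D list of normalized values, one per yarn crossing.
    Several crossings can exceed the threshold at once, so multiple
    simultaneous finger touches are reported naturally.
    """
    return [(r, c)
            for r, row in enumerate(readings)
            for c, value in enumerate(row)
            if value > THRESHOLD]

# Two fingers resting on a 4x4 swatch produce two strong readings.
swatch = [[0.0, 0.0, 0.0, 0.0],
          [0.0, 0.9, 0.0, 0.0],
          [0.0, 0.0, 0.0, 0.8],
          [0.0, 0.0, 0.0, 0.0]]
print(find_touches(swatch))  # [(1, 1), (2, 3)]
```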
Levi Strauss has agreed to work with Google on integrating the technology into clothing, but no details were given about when touch-responsive clothing might become available to buy.
Poupyrev said they are still working out how best to integrate the electronics, wireless communications, and batteries into a textile touch panel. The only demonstration of how the technology might operate in a garment came in a video in which a Savile Row tailor made a jacket with a touch-responsive patch above the cuff on one sleeve. When a finger swiped across the panel, a nearby smartphone made a call.