While most news from the Consumer Electronics Show is about new products, some early stage technology that could appear in future generations of products also gets unveiled. This year several companies demonstrated prototype technology that could make future touch screens flexible, curved, or even capable of sprouting physical buttons for certain tasks.
One of the most striking ideas on show was PaperTab, an early prototype of a tablet computer flexible enough to roll up like a newspaper. PaperTab was created by the Human Media Lab at Queen’s University, Canada, with the assistance of Plastic Logic, a U.K. company that makes flexible display technology. In Las Vegas, Aneesh Tarun and Roel Vertegaal from Queen’s showed off the prototypes, which still require tethering to a power source.
The Kingston duo combined an off-the-shelf touch-sensing sheet with a black and white display from Plastic Logic that was 0.3 millimeters thick to make PaperTab. They included sensors capable of detecting when a corner of the device was bent, so that the flexing gesture could be used to turn the pages of an e-book.
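The bend-to-turn-page interaction can be sketched in a few lines. This is an illustrative sketch, not PaperTab's actual firmware; the threshold value and the signed bend reading are assumptions for the example.

```python
BEND_THRESHOLD = 0.2  # hypothetical normalized flex-sensor threshold

def page_turn(bend_value, current_page, last_page):
    """Interpret a corner flex: positive bends page forward, negative back.

    bend_value: signed, normalized reading from the corner bend sensor
    (an assumption for this sketch; the real sensor output is not public).
    """
    if bend_value > BEND_THRESHOLD:       # corner flexed forward: next page
        return min(current_page + 1, last_page)
    if bend_value < -BEND_THRESHOLD:      # corner flexed backward: previous page
        return max(current_page - 1, 0)
    return current_page                   # flex too slight to count as a gesture
```

A slight flex below the threshold is ignored, so casual handling of the sheet does not flip pages.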
The three PaperTabs on display in Vegas could also be used together: when one showed an e-mail inbox, tapping an individual e-mail in that list with the corner of a second PaperTab opened the message on the second device. Flexing the corner of that second screen switched to composing a reply, and tapping a third device displaying a photo attached that image to the e-mail. “It solves a cognitive problem that paper or conventional computing devices don’t,” says Vertegaal. “This is the world’s first true paper-like computer.”
Plastic Logic’s displays are made by printing with conductive polymer compounds. It had previously sold its own devices based on this technology, including the Que e-reader. But in May 2012, the company switched its strategy and began trying to encourage established gadget makers to adopt its technology. Although the PaperTabs are black and white, Plastic Logic also makes color flexible displays, and showed the latest version of this technology, which is 0.16 millimeters thick, at CES.
Synaptics, which provides trackpads in many laptops and other devices, showed a prototype tablet that senses touches on both the front and the back of its screen. Adding multi-touch sensors to the back of the device could allow for new means of control, such as swiping the back of a tablet to scroll or flip pages. “We can enable apps to make use of the extra data from these sensors on the back,” said Andrew Hsu, a technology strategist at Synaptics.
Hsu showed how an e-book-reading app on the prototype tablet could distinguish a “grip” touch from one intended as a control gesture, responding by flowing text around a thumb on the display so it didn’t block any words. The screen of Apple’s iPad mini sits close enough to the device’s edge that many users may encounter the problem Synaptics is targeting, says Hsu.
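The grip-versus-gesture distinction described above can be sketched with a simple classifier. This is an illustrative sketch of the idea, not Synaptics' actual algorithm; the edge margin and contact-area threshold are assumed values for the example.

```python
EDGE_MARGIN = 0.08       # hypothetical: fraction of screen width treated as "edge"
GRIP_CONTACT_AREA = 1.5  # hypothetical contact-area threshold (cm^2)

def classify_touch(x_frac, contact_area):
    """Classify a touch as a resting grip or a deliberate gesture.

    x_frac: horizontal position as a fraction of screen width (0.0 to 1.0)
    contact_area: reported contact patch size in cm^2 (assumed units)
    """
    near_edge = x_frac < EDGE_MARGIN or x_frac > 1 - EDGE_MARGIN
    if near_edge and contact_area >= GRIP_CONTACT_AREA:
        return "grip"     # a thumb holding the device: reflow text around it
    return "gesture"      # deliberate input: deliver to the app
```

A broad contact patch near the screen edge reads as a thumb holding the device, so the e-reader can reflow text around it rather than treating it as a tap.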
Atmel, based in San Jose, showed another technology that could change the way mobile devices respond to touch. The company’s new Xsense touch sensors can be conformed to more severe bends and corners than those used in existing devices, making it possible for touch-sensitive areas to extend beyond a display and around the edges of a device. That might allow new forms of interaction, triggered by touching a device’s edges. Scrolling, for example, could be achieved by sliding a finger along a phone’s edge so the action doesn’t obscure any part of the screen.
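The edge-scrolling interaction could work by mapping finger movement along the edge sensor to a scroll offset. This is a minimal sketch of that mapping, not Atmel's Xsense API; the gain constant and millimeter units are assumptions for the example.

```python
PIXELS_PER_MM = 12.0  # hypothetical gain: screen pixels scrolled per mm of finger travel

def edge_scroll(prev_mm, curr_mm, scroll_px):
    """Update the scroll offset from two successive edge-touch positions.

    prev_mm, curr_mm: finger position along the device edge, in millimeters
    scroll_px: current scroll offset in pixels
    """
    return scroll_px + (curr_mm - prev_mm) * PIXELS_PER_MM
```

Because the finger stays on the edge of the device, the scroll gesture never covers any part of the display.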
This approach might also allow designers to remove physical buttons, such as the volume rockers on many phones. Glass manufacturer Corning has recently started offering curved versions of the tough Gorilla Glass used in many mobile devices. Atmel showed a mock phone with both Gorilla Glass and an Xsense touch sensor wrapped around its edge.
Finally, Tactus, based in Fremont, California, unveiled a prototype tablet with a seven-inch screen that can sprout buttons from its flat surface (see “A Shape-Shifting Smartphone Touch Screen”). Buttons rise from the surface of a Tactus screen when liquid is pumped into reservoirs inside the display, and then disappear when the fluid is withdrawn. The company hopes to license its technology to device manufacturers interested in bringing a more tactile experience to touch screens.