Tabletop touch screens such as Microsoft’s Surface are designed for sharing and collaboration, but it’s difficult for them to tell one person from another. Researchers in the U.K. have developed a new way to identify different users: via mobile phones.
The prototype system, called PhoneTouch, lets users manipulate onscreen objects, such as photos, or select buttons, by touching any part of their phone to the screen. This also makes it possible to personalize interactions, says Hans Gellersen, a professor of interactive systems at Lancaster University, who developed the system with his student Dominik Schmidt.
PhoneTouch also makes it possible to transfer files between the phone and the surface. “Surfaces in general are good for working together in parallel,” says Gellersen. “But when people work together they also want to bring information into the group.”
PhoneTouch uses a camera positioned beneath the surface to recognize finger contact. The system can also discern the pattern made when the edge of a phone touches the surface. “The phone gives a different visual blob than the finger,” says Gellersen.
To identify which phone is in contact with the surface, PhoneTouch interrogates the accelerometers built into connected phones to see which of them experienced a slight bump at precisely the moment of contact. "These two events are correlated in time," he says. This is an approach known as separate event detection.
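The correlation step can be sketched in a few lines. This is an illustrative reconstruction, not code from the actual system: the function name, the 50-millisecond matching window, and the data layout are all assumptions. The idea is simply that each connected phone reports timestamps of accelerometer bumps, and the tabletop assigns a surface touch to the phone whose bump falls closest to it in time.

```python
# Hypothetical sketch of PhoneTouch-style time correlation (names and the
# 50 ms window are illustrative assumptions, not from the actual system).

def match_touch_to_phone(touch_time, bump_events, window=0.05):
    """Return the ID of the phone whose accelerometer bump is nearest
    to the surface-touch timestamp, within `window` seconds; None if
    no phone bumped close enough in time.

    bump_events: dict mapping phone_id -> list of bump timestamps (seconds).
    """
    best_phone, best_delta = None, window
    for phone_id, timestamps in bump_events.items():
        for t in timestamps:
            delta = abs(t - touch_time)
            if delta <= best_delta:
                best_phone, best_delta = phone_id, delta
    return best_phone

# Example: two phones report bumps; a touch at t = 10.02 s matches "alice",
# whose bump at 10.01 s is the only one inside the 50 ms window.
bumps = {"alice": [10.01], "bob": [9.80, 10.40]}
print(match_touch_to_phone(10.02, bumps))  # → alice
```

This also makes the failure mode Schmidt mentions concrete: if two phones bump inside the same narrow window, the timestamps alone cannot disambiguate them.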
"It's very clever," says Eva Hornecker, who studies the usability of touch surfaces at the University of Strathclyde. "Normally surfaces don't know who's who." PhoneTouch could, for example, allow files taken from a phone to be shared with others without letting anyone else alter or save them, Hornecker notes.
Separate event detection is already used by the popular smartphone app Bump, which lets users exchange information by tapping, or "bumping," two phones together. PhoneTouch differs in that it allows the pairing of a personal device with a shared device, says Gellersen. "PhoneTouch not only establishes a connection but allows the phone to be used as a stylus on the surface, to select specific widgets."
Schmidt says that there is a slim chance that the system will be confused when two phones touch it simultaneously. But a user study has shown that it identifies the correct device 99.99 percent of the time. The researchers will present the work at the User Interface Software and Technology symposium in New York this week.
“It’s a great idea,” says Rob Miller, head of the User Interface Design Group at MIT. “The traditional desktop approach of username and password entry doesn’t make much sense on a multitouch tabletop,” because text entry is less natural.
Mitsubishi Electric has demonstrated a touch surface called DiamondTouch that uses sensors in chairs to match touches to different people. But using a phone may be more convenient. "Mobile phones are already ubiquitous; people carry them anyway," Miller agrees. "Phones are very personal. We almost always have them with us."