Large touch-screen tables have emerged as a useful way for several people to collaborate on projects like video editing or graphic design, but often these tasks require fine controls that can be difficult to simulate on a touch surface with limited resolution. When a person needs precision, it may be best to use a physical controller instead, says Dan Morris, a researcher at Microsoft.
Morris and his colleagues have developed software for touch-screen surfaces that allows physical controls to be added to them. In addition, the software lets people define the functions that each knob, button, and slider on a controller will perform.
The researchers’ system, called Ensemble, was presented on Monday at the Computer-Human Interaction (CHI 2009) Conference in Boston. It consists of a touch table, made by former Microsoft intern Bjoern Hartmann, that is six feet long and four feet wide, plus several portable sound-editing controllers that connect to the computer driving the surface. The table is similar to Microsoft’s Surface, but larger. As with Surface, cameras underneath the tabletop sense when a user touches the surface or places an object on top of it.
The idea of combining traditional input devices such as mice and keyboards with a touch display is not new, but with Ensemble the Microsoft researchers show that it’s possible to make a piece of hardware do more than a single fixed task.
Cameras within the Ensemble table detect a special tag on the bottom of each audio control box to recognize each box and determine its position on the surface. The software then produces an “aura” around each device, including touch-surface controls like “play,” “pause,” and “stop,” and virtual sliders that correspond to physical knobs on the box.
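In rough pseudocode terms, the recognition step could be sketched as follows. This is a minimal illustration under assumed names: the tag IDs, device names, and control layout below are hypothetical, not Ensemble’s actual data or implementation.

```python
# Sketch: map a fiducial tag detected under the table to an "aura" of
# virtual controls drawn around the physical box. Tag IDs, device names,
# and the control set are illustrative assumptions, not Ensemble's real data.

# Registry of known controller boxes, keyed by fiducial tag ID.
DEVICE_REGISTRY = {
    17: {"name": "audio-box-A", "knobs": ["zoom", "pan"]},
    23: {"name": "audio-box-B", "knobs": ["volume"]},
}

TRANSPORT_BUTTONS = ["play", "pause", "stop"]

def build_aura(tag_id, x, y):
    """Return the virtual controls to draw around a box detected at (x, y)."""
    device = DEVICE_REGISTRY.get(tag_id)
    if device is None:
        return None  # unrecognized object on the surface; draw nothing
    controls = [{"type": "button", "label": b} for b in TRANSPORT_BUTTONS]
    # One virtual slider mirrors each physical knob on the box.
    controls += [{"type": "slider", "label": k} for k in device["knobs"]]
    return {"device": device["name"], "position": (x, y), "controls": controls}
```

For example, a box tagged 17 placed at (120, 300) would get an aura with play, pause, and stop buttons plus “zoom” and “pan” sliders mirroring its two knobs.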
A person can then edit a music track, for example, using both the physical device and the touch-surface controls. The virtual sliders can be used to zoom in on the audio waveform of a track, or to go to a different location on the waveform by panning. The physical knobs on the box perform the same function but offer much finer control. The system also allows a person to change the function of the knobs to, say, control the volume of a trumpet track instead.
“It’s a software mechanism for telling the hardware what to do,” says Morris. He explains that once a person has mapped different functions onto the controller, she’s able to save the mapping for later or pass it along to someone else who has a similar role in the editing process.
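One plausible shape for such remappable bindings is sketched below. The class and function names are assumptions for illustration, not Microsoft’s design: each knob ID is bound to a named function, incoming knob events are dispatched through the binding, and the whole mapping serializes to JSON so it can be saved or handed to a collaborator.

```python
import json

class ControlMapping:
    """Bind physical knob IDs to editing functions; save/share as JSON.

    Illustrative sketch only -- not Ensemble's actual mechanism.
    """

    def __init__(self):
        self.bindings = {}   # knob_id -> function name
        self.handlers = {}   # function name -> callable

    def register(self, name, handler):
        self.handlers[name] = handler

    def bind(self, knob_id, name):
        # Rebinding a knob simply overwrites its previous function.
        self.bindings[knob_id] = name

    def on_knob_turn(self, knob_id, value):
        name = self.bindings.get(knob_id)
        if name in self.handlers:
            self.handlers[name](value)

    def save(self):
        return json.dumps(self.bindings)  # text a collaborator can load

    @classmethod
    def load(cls, text):
        m = cls()
        # JSON object keys are strings; restore the integer knob IDs.
        m.bindings = {int(k): v for k, v in json.loads(text).items()}
        return m
```

In use, a knob first bound to “zoom” could later be rebound to, say, trumpet volume, and the saved JSON passed to another editor who loads it into the same software.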
The paper, presented at CHI 2009 by Rebecca Fiebrink, a graduate student at Princeton University, also describes a study examining how people used the interface. Most of the study participants used the physical controls, favoring their accuracy and responsiveness. However, these participants also made extensive use of the touch-surface controls, choosing them mainly for tasks in which a single touch produced a discrete result, such as playing or stopping a track.
Robert Jacob, a professor of electrical engineering at Tufts University, in Medford, MA, says that the researchers “did a nice job of investigating what users actually did when given both [physical controllers and a touch screen] and the opportunity to switch between them.”
Jacob, who chaired the session in which the paper was presented, acknowledges that bridging the gap between physical and digital objects can be challenging. “It’s a difficult problem with no general solutions, but rather individual interesting designs,” he says. “Ideally, you want the benefits of the digital without giving up those of the physical.”
While Ensemble was designed for sound editing, its underlying technology could find other applications in graphics, gaming, and visual design, says Morris. “It could be used in scenarios where you want people to collaborate on a surface as a group,” he says, but where the resolution of the touch surface limits the precision of the virtual controls.