New Computer Interface Goes Beyond Just Touch
“Everything, including touch, is best for something and worse for something else,” says Ken Hinckley, a research scientist at Microsoft who is involved with the project, which will be presented this week at the ACM Conference on Human Factors in Computing Systems (CHI) in Atlanta.
The prototype for Manual Deskterity is a drafting application built for the Microsoft Surface, a tabletop touchscreen. Users can perform typical touch actions, such as zooming in and out and manipulating images, but they can also use a pen to draw or annotate those images.
The interface’s most interesting features emerge when the two types of interaction are combined. For example, a user can copy an object by holding it with one hand and dragging the pen across it, “peeling” off a duplicate that can be placed elsewhere on the screen. Combining pen and hand also gives users tools such as an X-Acto knife, a rubber stamp, and brush painting.
Hinckley says the researchers videotaped users working on visual projects with sketchbooks, scissors, glue, and other typical physical art supplies. They noticed that people tended to hold an image with one hand while making notes about it or doing other related work with the other. The researchers incorporated this into their interface: touching an object onscreen with the free hand indicates that the actions performed with the pen relate to that object.
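That division of labor (the pen writes, touch manipulates, and a touch-held object scopes what the pen does) can be sketched roughly in code. The snippet below is a hypothetical illustration using the web Pointer Events API, not the team’s actual Microsoft Surface implementation; names such as hitTest and startCopyPeel are invented for the example.

```typescript
// Hypothetical sketch: route pen strokes to whichever object a finger is holding.
// CanvasObject, hitTest, startCopyPeel, and startInkStroke are illustrative names.

type CanvasObject = { id: string; contains(x: number, y: number): boolean };

let heldObject: CanvasObject | null = null;   // object pinned by the non-dominant hand
const objects: CanvasObject[] = [];           // scene contents

function hitTest(x: number, y: number): CanvasObject | null {
  return objects.find(o => o.contains(x, y)) ?? null;
}

const canvas = document.getElementById("canvas") as HTMLCanvasElement;

canvas.addEventListener("pointerdown", (e: PointerEvent) => {
  if (e.pointerType === "touch") {
    // A finger on an object "holds" it, scoping subsequent pen actions to it.
    heldObject = hitTest(e.offsetX, e.offsetY);
  } else if (e.pointerType === "pen") {
    if (heldObject) {
      // Pen on a held object: contextual action, e.g. start peeling off a copy.
      startCopyPeel(heldObject, e.offsetX, e.offsetY);
    } else {
      // Pen alone: ordinary inking.
      startInkStroke(e.offsetX, e.offsetY);
    }
  }
});

canvas.addEventListener("pointerup", (e: PointerEvent) => {
  if (e.pointerType === "touch") heldObject = null;  // lifting the finger ends the hold
});

// Placeholder implementations for the hypothetical actions above.
function startCopyPeel(obj: CanvasObject, x: number, y: number) { /* ... */ }
function startInkStroke(x: number, y: number) { /* ... */ }
```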
Hinckley acknowledges that the interface includes a lot of tricks that users need to learn. But he thinks this is true of most interfaces. “This idea that people just walk up with an expectation of how a [natural user interface] should work is a myth,” he says.
Hinckley believes that natural user interfaces can ease the learning process by engaging muscle memory, rather than forcing users to memorize sequences of commands or the layout of menus. If the work is successful, Hinckley says it will show how different sorts of input can be used in combination.
Hinckley also thinks it’s a mistake to focus on devices that work with touch input alone. He says, “The question is not, ‘How do I design for touch?’ or ‘How do I design for pen?’ We should be asking, ‘What is the correct division of labor in the interface for pen and touch interactions such that they complement one another?’”
The researchers plan to follow up by adapting their interface to work on mobile devices.