
New Computer Interface Goes Beyond Just Touch

Manual Deskterity combines touch with the trusty pen.
April 12, 2010
Touch screen interfaces may be trendy in gadget design, but that doesn’t mean they do everything elegantly. The finger is simply too blunt for many tasks. A new interface, called Manual Deskterity, attempts to combine the strengths of touch interaction with the precision of a pen.

“Everything, including touch, is best for something and worse for something else,” says Ken Hinckley, a research scientist at Microsoft who is involved with the project, which will be presented this week at the ACM Conference on Human Factors in Computing Systems (CHI) in Atlanta.

The prototype for Manual Deskterity is a drafting application built for the Microsoft Surface, a tabletop touchscreen. Users can perform typical touch actions, such as zooming in and out and manipulating images, but they can also use a pen to draw or annotate those images.

The interface’s most interesting features emerge when the two types of interaction are combined. For example, a user can copy an object by holding it with one hand and then dragging the pen across the image, “peeling” off a new image that can be placed elsewhere on the screen. By combining pen and hand, users get access to features such as an X-Acto knife, a rubber stamp, and brush painting.

Hinckley says the researchers videotaped users working on visual projects with sketchbooks, scissors, glue, and other typical physical art supplies. They noticed that people tended to hold an image with one hand while making notes about it or doing other work related to it with the other. The researchers decided to incorporate this into their interface: touching an object onscreen with a free hand indicates that the actions performed with the pen relate to that object.
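To make that division of labor concrete, here is a minimal sketch in TypeScript of how such behavior might be wired up. The types, method names, and API below are illustrative assumptions, not the actual Manual Deskterity or Microsoft Surface code: a touch "holds" an object, and any pen strokes that arrive while it is held are routed to that object rather than to the background canvas.

```typescript
// Hypothetical sketch: route pen strokes to whichever object is currently
// held by a touch, mirroring the "hold with one hand, act with the pen"
// behavior described in the article. All names here are invented for
// illustration.

type Point = { x: number; y: number };

interface CanvasObject {
  id: string;
  contains(p: Point): boolean;      // hit test against the object's bounds
  annotate(stroke: Point[]): void;  // pen acts on a held object
}

class PenTouchDispatcher {
  private heldObject: CanvasObject | null = null;

  constructor(private objects: CanvasObject[]) {}

  // A touch "holds" the object under the finger.
  onTouchDown(p: Point): void {
    this.heldObject = this.objects.find(o => o.contains(p)) ?? null;
  }

  // Lifting the finger releases the hold.
  onTouchUp(): void {
    this.heldObject = null;
  }

  // Pen strokes are scoped to the held object; with nothing held,
  // the pen simply draws ordinary ink on the canvas.
  onPenStroke(stroke: Point[]): void {
    if (this.heldObject) {
      this.heldObject.annotate(stroke);
    } else {
      this.drawOnCanvas(stroke);
    }
  }

  private drawOnCanvas(stroke: Point[]): void {
    // ...render the stroke as freehand ink...
  }
}
```

In this sketch the touch hand sets the context and the pen performs the action, which is one plausible way to read the behavior the researchers describe.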

Hinckley acknowledges that the interface includes a lot of tricks that users need to learn. But he thinks this is true of most interfaces. “This idea that people just walk up with an expectation of how a [natural user interface] should work is a myth,” he says.

Hinckley believes that natural user interfaces can ease the learning process by engaging muscle memory, rather than forcing users to memorize sequences of commands or the layout of menus. If the work is successful, Hinckley says, it will show how different sorts of input can be used in combination.

Hinckley also thinks it’s a mistake to focus on devices that work with touch input alone. He says, “The question is not, ‘How do I design for touch?’ or ‘How do I design for pen?’ We should be asking, ‘What is the correct division of labor in the interface for pen and touch interactions such that they complement one another?’”

The researchers plan to follow up by adapting their interface to work on mobile devices.
