Touchscreen interfaces may be trendy in gadget design, but that doesn’t mean they do
everything elegantly. The finger is simply too blunt for many tasks. A new
interface, called Manual Deskterity, attempts to combine the strengths of touch
interaction with the precision of a pen.
“Everything,
including touch, is best for something and worst for something else,” says Ken Hinckley,
a research scientist at Microsoft who is involved with the project, which will
be presented this week at the ACM Conference on Human Factors
in Computing Systems (CHI) in
Atlanta.
The
prototype for Manual Deskterity is a drafting application built for the
Microsoft Surface, a tabletop touchscreen. Users can perform typical touch
actions, such as zooming in and out and manipulating images, but they can also
use a pen to draw or annotate those images.
The
interface’s most interesting features emerge when the two types of
interaction are combined. For example, a user can copy an object by holding it
with one hand and then dragging the pen across the image, “peeling”
off a new image that can be placed elsewhere on the screen. By combining pen
and hand, users get access to features such as an exacto knife, a rubber stamp,
and brush painting.
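In implementation terms, the hold-and-peel copy amounts to a simple rule over simultaneous touch and pen events. The TypeScript sketch below is purely illustrative, not the Manual Deskterity code; the event handlers, the `contains` hit test, and the `clone` helper are all assumptions made for this example.

```typescript
// Illustrative sketch only, not the Manual Deskterity source: the event
// handlers, hit test, and clone helper are assumptions for this example.

interface Point { x: number; y: number; }

interface CanvasObject {
  contains(p: Point): boolean;  // hit test against the object's bounds
  clone(): CanvasObject;        // duplicate used for the "peeled" copy
  moveTo(p: Point): void;
}

class PenTouchCanvas {
  private heldObject: CanvasObject | null = null;  // object pinned by a finger

  constructor(private objects: CanvasObject[]) {}

  // A finger lands on the tabletop: pin whichever object it touches.
  onTouchDown(p: Point): void {
    this.heldObject = this.objects.find(o => o.contains(p)) ?? null;
  }

  // Lifting the finger releases the held object.
  onTouchUp(): void {
    this.heldObject = null;
  }

  // Pen drag: if the stroke starts on a held object and leaves its bounds,
  // peel off a copy that follows the pen; otherwise the pen just draws ink.
  onPenDrag(from: Point, to: Point): void {
    const held = this.heldObject;
    if (held !== null && held.contains(from) && !held.contains(to)) {
      const copy = held.clone();  // the peeled duplicate
      copy.moveTo(to);            // it lands where the pen is dragged
      this.objects.push(copy);
    } else {
      this.drawInk(from, to);     // default behavior: pen means annotate
    }
  }

  private drawInk(from: Point, to: Point): void {
    // Render a stroke segment from `from` to `to` (omitted here).
  }
}
```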
Hinckley
says the researchers videotaped users working on visual projects with
sketchbooks, scissors, glue, and other typical physical art supplies. They
noticed that people tended to hold an image with one hand while making
notes about it or doing other work related to it with the other. The researchers decided to
incorporate this behavior into their interface: touching an object onscreen with a free hand indicates that the actions performed with the pen relate to that object.
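Read as a dispatch rule, that observation boils down to a small decision table: the pen alone writes, touch alone manipulates, and the pen while the free hand holds an object applies a contextual tool to that object. A minimal sketch of that division of labor, with hypothetical mode names, might look like this:

```typescript
// Hypothetical sketch of the division of labor described above;
// the mode names and input fields are assumptions, not the project's API.

type Tool = "ink" | "manipulate" | "contextual";

interface InputState {
  penDown: boolean;           // is the pen touching the surface?
  objectHeldByTouch: boolean; // is the free hand holding an object?
}

// Pen alone writes, touch alone manipulates, and the combination puts
// the pen into a contextual tool (knife, stamp, brush) that applies to
// whatever the free hand is holding.
function resolveTool(input: InputState): Tool {
  if (input.penDown && input.objectHeldByTouch) return "contextual";
  if (input.penDown) return "ink";
  return "manipulate";
}
```

Keeping the rule that small reflects the design idea the researchers describe: neither modality owns the interface on its own; which tool is active falls out of how the two are combined.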
Hinckley
acknowledges that the interface includes a lot of tricks that users need to
learn. But he thinks this is true of most interfaces. “This idea that
people just walk up with an expectation of how a [natural user interface]
should work is a myth,” he says.
Hinckley
believes that natural user interfaces can ease the learning process by engaging
muscle memory, rather than forcing users to memorize sequences of commands or
the layout of menus. If the work is successful, Hinckley says, it will show how
different sorts of input can be used in combination.
Hinckley
also thinks it’s a mistake to focus on devices that work with touch input
alone. He says, “The question is not, ‘How do I design for touch?’ or ‘How
do I design for pen?’ We should be asking, ‘What is the correct division of
labor in the interface for pen and touch interactions such that they complement
one another?’”
The
researchers plan to follow up by adapting their interface to work on mobile
devices.