The mouse and keyboard have been how nearly everyone has interacted with computers since Apple and Microsoft brought them to the mainstream in the mid-1980s. There have been periodic attempts to replace these devices, mainly on the grounds that, because they are so old, there must be something better by now.

But as the speakers on a panel on the future of the human-computer interface at the Consumer Electronics Show pointed out, the mouse and keyboard haven't changed much because how people use computers hasn't changed much. We still typically use computers with a relatively large screen on some kind of tabletop. The panelists were drawn from organizations such as Microsoft's mobile division, HP, and Sony's PlayStation group.

However, mobile computing is now firmly established, moving people away from the desktop, and new applications are driving new interfaces: smartphones with touch screens, voice-controlled automotive entertainment systems, and motion-based game controllers. The struggle for interface designers is to establish some kind of common grammar across all these systems, so that people can move seamlessly from device to device without having to learn how to operate each one individually. This is as much a marketing challenge as a technical one: however intuitive it might feel today, people had to be taught that making pinching motions on a screen means zooming in and out.

Looking towards the future, it’s likely that application designers will have to start taking into account contextual shifts between different interfaces on the same device: for example, a navigation application on a smartphone could be designed for a touch-based interface, but if a user starts driving a car, the application should be able to switch over to voice-based input and output, possibly tapping into the car’s built-in hands-free phone system. Beyond that? Maybe mind control, the panel suggested, tapping electrical impulses to control the computers around us, although they admitted this is still a long way from mainstream adoption.
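
To make the idea concrete, here is a minimal sketch, in TypeScript, of how an application might swap input modalities when its context changes. The panel described no specific API; every name below is a hypothetical illustration of the design, not code from any real navigation app or phone platform.

```typescript
// Hypothetical sketch: an app that switches between touch and voice
// interfaces when it detects that the user has started driving.
type Modality = "touch" | "voice";

interface ModalityHandler {
  activate(): void;   // e.g. show touch controls, or start voice prompts
  deactivate(): void;
}

class NavigationApp {
  private current: Modality = "touch";

  constructor(private handlers: Record<Modality, ModalityHandler>) {
    this.handlers[this.current].activate();
  }

  // Called whenever a context signal arrives, e.g. the phone pairs with the
  // car's hands-free system or motion sensors suggest the user is driving.
  onContextChange(isDriving: boolean): void {
    const next: Modality = isDriving ? "voice" : "touch";
    if (next !== this.current) {
      this.handlers[this.current].deactivate();
      this.handlers[next].activate();
      this.current = next;
    }
  }
}

// Usage: wire the app to stand-in handlers and simulate a drive.
const app = new NavigationApp({
  touch: { activate: () => console.log("touch UI on"), deactivate: () => console.log("touch UI off") },
  voice: { activate: () => console.log("voice prompts on"), deactivate: () => console.log("voice prompts off") },
});

app.onContextChange(true);  // user starts driving: switch to voice
app.onContextChange(false); // car parked: back to touch
```

The point of the sketch is simply that the context detection and the interface grammar are separate concerns: the application decides which modality is appropriate, while each handler encapsulates how that modality behaves.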
