Mouse/Graphical User Interface

“When I started with the mouse, very few were taking seriously that people would want to work online at a computer display,” says Douglas Engelbart, who invented the mouse and graphical user interface (GUI) in the 1960s. His mouse/GUI combination, further developed at Xerox Palo Alto Research Center (PARC) in the 1970s and popularized by Apple in the 1980s, made a computer’s contents visible. Before that, to edit a computer file, you had to remember its name and location. The reduced demand on short-term memory, combined with a visual-spatial environment that users enjoyed, turned the computer display into a workspace. In his book Interface Culture, Steven Johnson says Engelbart’s invention “probably had more to do with popularizing the digital revolution than any other software advance.”

Making everything visual rather than linguistic, however, means that semantically complex commands get left in the dust. With a command-line operating system (remember DOS?), a task such as making a copy of every file ending in “.txt” took a few keystrokes. A GUI offers no shortcut. Engelbart, whose original interface put a mouse in one of the user’s hands and a special “chording” keyset in the other, thinks today’s GUI is awfully primitive: “Here’s the language they’re proposing: You point to something and grunt.” Our cave-dwelling ancestors would have understood.
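
To make the comparison concrete, here is the kind of DOS one-liner the passage describes (the destination drive is illustrative):

    copy *.txt a:

That single command copies every “.txt” file in the current directory to a disk in drive A; in a GUI file manager, you would have to find and select each matching file by hand.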

Contribution to the lexicon: “Point-and-click”


Barcode Scanner

In February 1992, George Bush was given a demo of a supermarket barcode scanner made by NCR. His response? “That’s amazing!” Contrary to news accounts of the incident, however, he wasn’t wowed by the mere existence of scanner technology, which had been around since 1974. He was marveling at a new, improved version that could read a barcode torn into seven pieces.

Scanners have come a long way since the first 10-pack of Juicy Fruit gum was scanned at Marsh Supermarket in Troy, Ohio. The initial draw for companies was accuracy of data entry: Barcode readers made far fewer errors than cashiers. But lurking in the laser’s capability was the potential to collect vast amounts of information on what products are selling, when, and in what combinations. Says Craig Maddox, product line director for barcode scanners at NCR, “It was a good 15 years before the grocery industry started to use the data.” Nowadays, retailers compile terabyte-sized databanks of every transaction in their stores and sell the data back to vendors; barcodes have also sped up communication across the whole supply chain so much, remarks Maddox, that “some stores…don’t pay for the product until it’s already sold.”

Contribution to the lexicon: “Scan it”
