The following document arrived at the offices of Technology Review in a time capsule dated 2020. It purports to be a history of computers written by computer scientist-turned-historian John Seely Brown. In the late 20th century, Dr. Brown served as director of Xerox Corporation’s Palo Alto Research Center.

The history of computers is actually quite simple. In the beginning there were no computers. Then there were computers. And then there were none again. Between the second and the third stage, they simply disappeared. They didn’t go away completely. First they faded into the background. Then they actually merged with the background.

These different stages of computing came to be known in terms of their central motifs. The initial stage, after computers emerged from the back rooms into public view, was the era of personal computing, which spanned the 1980s and early 1990s. With the advent of the Internet and the World Wide Web, this era seamlessly became the age of social computing, sometimes called ubiquitous computing, which began in the mid-1990s and lasted some two decades. This age was characterized by millions of computers, information appliances, and storage devices that were interconnected, creating a vast information medium that supported all kinds of communities of interest. This new medium offered access to nearly any information residing anyplace in the world.

Roughly 15 years into the 21st century, the social computing stage morphed into the period called ecological or symbiotic computing. Structural matter (atoms) and computing (bits) became inseparable. Zillions of sensors, effectors and logical elements (made of organic and inorganic materials) were interconnected via wireless, peer-to-peer technologies, producing smart, malleable stuff used to build smart appliances, buildings, roads and more. It was during this era that computers disappeared. In their place, nearly every physical artifact harbored some computationally based brainpower that helped it know where it was, what was near it, when it was moved and so on. In a way, the inorganic world took on organic properties, using computing to transparently modulate responses to the environment.

But how did this come to be? During the personal computing stage, computers became increasingly powerful, but they also became harder to use. Moore's Law, which stated that computing power would double every 18 months, seemed to hold for hardware. But robust software never could keep up, and the result was that personal computers remained hard to use. The graphical user interfaces of the 1980s, at least, made systems somewhat manageable. But even that degree of usability faded in the second era of computing, when designers tried to extend this interface motif to navigating the vast information and document spaces of the Web. Those who surfed the Net all day long just ended up feeling disoriented or lost. More casual users felt overwhelmed by the volumes of irrelevant information given to them by their intelligent agents, or "bots" (as these were often called at the turn of the 21st century).

Eventually the Web became a jungle of information pathways with no cues to guide folks to their destinations, much like the center of a megacity without reliable signs or guides. Urban architects and social theorists were called on to help technologists see the resources that lay latent in the social and physical context. Humans, it was pointed out, used the context around objects and events to navigate the world and get things done. For example, they found out what was worth reading when a friend recommended a book or when they heard about an important article at work.

It turned out that interaction with other people was the key. Humans wanted technology to help them keep better connected to each other and to enhance their awareness of events around them. But they didn’t want to have to attend to every little thing; all they wanted was a virtual awareness that would take place subconsciously, much like how the visual system works in the physical world.
