From today’s perspective, it seems clear that Gordon Moore got lucky. Back in 1965, Electronics magazine asked Moore, then research director of electronics pioneer Fairchild Semiconductor, to predict the future of the microchip industry. At the time, the industry was in its infancy; Intel, now the world’s biggest chip-maker, would not be founded (by Moore, among others) for another three years. Because few chips had been manufactured and sold, Moore had little data to go on. Nonetheless, he confidently argued that engineers would be able to cram an ever-increasing number of electronic devices onto microchips. Indeed, he guessed that the number would roughly double every year, an exponential increase that has come to be known as Moore’s Law.

At first, few paid attention to Moore’s prediction. Moore himself admitted that he didn’t place much stock in it; he had been “just trying to get across the idea [that] this was a technology that had a future.” But events proved him right. In 1965, when Moore wrote his article, the world’s most complex chip was right in his lab at Fairchild: It had 64 transistors. Intel’s new-model Pentium III, introduced last October, contains 28 million transistors. “The sustained explosion of microchip complexity, doubling year after year, decade after decade,” Lillian Hoddeson and Michael Riordan write in Crystal Fire, their history of the transistor, “has no convenient parallel or analogue in normal human experience.”
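
For a rough sense of that scale, the two figures above (64 transistors in 1965, 28 million in the Pentium III introduced in October 1999) are enough for a back-of-the-envelope calculation. The sketch below is purely illustrative; the variable names are mine, not anything drawn from the article or from Intel.

```python
import math

# Figures cited in the article: Fairchild's most complex chip in 1965 had
# 64 transistors; Intel's Pentium III, introduced in October 1999, has
# 28 million.
transistors_1965 = 64
transistors_1999 = 28_000_000

growth_factor = transistors_1999 / transistors_1965  # how many times larger
doublings = math.log2(growth_factor)                 # doublings needed to get there

print(f"Growth factor: {growth_factor:,.0f}x")       # ~437,500x
print(f"Equivalent doublings: {doublings:.1f}")      # ~18.7
```

By this count, the two chips are separated by roughly nineteen doublings of complexity.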

The effect of Moore’s Law on daily life is obvious. It is why today’s $3,000 personal computer will cost $1,500 next year and be obsolete the year after. It is why the children who grew up playing Pong in game arcades have children who grow up playing Quake on the Internet. It is why the word-processing program that fit on two floppy disks a decade ago now fills up half a CD-ROM; in fact, it explains why floppy disks themselves have almost been replaced by CD-ROMs, CD-Rs and CD-RWs.

But these examples, as striking as they are, may understate the importance of Moore’s Law. The United States is experiencing the longest economic boom since the 1850s, when the federal government first began collecting economic statistics systematically. The current blend of steady growth and low inflation is so unusually favorable that many economists believe the nation is undergoing fundamental change. And the single most important factor driving the change, these economists say, is the relentless rise in chip power. “What’s sometimes called the ‘Clinton economic boom,’” says Robert Gordon, an economist at Northwestern University, “is largely a reflection of Moore’s Law.” In fact, he says, “the recent acceleration in productivity is at least half due to the improvements in computer productivity.”

If Gordon is right, it is unfortunate that just as economists are beginning to grasp the importance of Moore’s Law, engineers are beginning to say that it is in danger of petering out.

The age of digital electronics is usually said to have begun in 1947, when a research team at Bell Laboratories designed the first transistor. But Moore’s Law, the driving force of the digital era, is pegged to another, lesser-known landmark: the invention of the integrated circuit. John Bardeen, Walter Brattain and William Shockley won a Nobel Prize for the transistor. Jack Kilby, the Texas Instruments engineer who came up with the integrated circuit, didn’t win anything. But in many ways it was his creation, not the transistor, that most shook the world.

In May 1958 Kilby was hired by Texas Instruments, the company that pioneered the silicon transistor. The company had a mass vacation policy: nearly everyone was sent out of the office for the first few weeks of July. Newly hired, Kilby had no vacation time coming, and he found himself almost alone in the deserted plant.
