
The revelatory moment of the electronics age arguably came in January 1959, when Robert Noyce, an engineer and a founder of Fairchild Semiconductor, scrawled in his notebook the words “Methods of Isolating Multiple Devices.” Under that obscure heading, Noyce went on to write, “In many applications now it would be desirable to make multiple devices on a single piece of silicon in order to be able to make interconnections between devices as part of the manufacturing process, and thus reduce size, weight, etc., as well as cost per active element.”

Although the word for it did not yet exist, Noyce was describing the microchip. A former protégé of William Shockley, the co-inventor of the transistor, Noyce understood the transformative potential of new technology as well as anyone alive. His halting follow-up on his initial idea therefore casts light not just on the history of computers but on the often befogged pathways that lead to scientific advancement.

As Leslie Berlin, a visiting scholar at Stanford University, relates in her new biography, The Man Behind the Microchip: Robert Noyce and the Invention of Silicon Valley, “After noting his ideas in his lab notebook, Noyce did…nothing.”

Fairchild was a new company, and, as Noyce later recalled, he was preoccupied with selling transistors, not with inventions “that might make you some money somewhere down the road.” Noyce did not “invent” the chip to create something new but to solve an existing problem in an industrial process.

The problem was that circuits consisted of numerous discrete components (transistors, resistors, and so forth) requiring thousands of interconnections. Electronics users configured their own circuits by attaching these components to each other one at a time, “a process fraught,” Berlin tells us, “with errors and failures.” As the number of interconnections rose, so did the odds of system failure. By the late 1950s, a score of companies were looking for a solution.
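The arithmetic behind that compounding risk is straightforward. As a rough back-of-the-envelope sketch (the per-connection failure rate below is a hypothetical figure, not one from the book): if each hand-made connection fails independently with probability p, the circuit works only when all n of its connections hold, so its odds of working are (1 − p)^n, which collapses quickly as n grows.

# A rough illustration with hypothetical numbers: if each hand-made
# connection fails independently with probability p, the whole circuit
# works only when every one of its n connections holds, i.e. with
# probability (1 - p)**n.

p_fail = 0.001  # assumed 0.1% failure rate per connection (illustrative)

for n in (10, 100, 1_000, 10_000):
    reliability = (1 - p_fail) ** n
    print(f"{n:>6} connections -> {reliability:.2%} chance the circuit works")

Even at a 0.1 percent failure rate per joint, a thousand hand-wired connections leave the circuit with only about a one-in-three chance of working, which is why building the interconnections into the silicon itself, as Noyce proposed, mattered so much.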

Two months after Noyce’s notebook entry, Texas Instruments announced that one of its engineers, Jack Kilby, had invented a crude integrated circuit. This may have been the spark that sent Noyce back to his notebook. In July, five months after Kilby’s filing, Noyce filed his own patent application for an integrated circuit. Though Kilby was first, he merely placed all the components on a single slab of germanium and wired them together the standard way: by hand. Noyce’s design was easier to mass-produce. His integrated circuit connected components in a single circuit on a chip of silicon that was small enough, as Berlin writes, to be “carried off by an ant.”
