MIT Technology Review

The Integrator

Robert Noyce dreamed up the microchip in a 1959 notebook entry.

The revelatory moment of the electronics age arguably came in January 1959, when Robert Noyce, an engineer and a founder of Fairchild Semiconductor, scrawled in his notebook the words “Methods of Isolating Multiple Devices.” Under that obscure heading, Noyce went on to write, “In many applications now it would be desirable to make multiple devices on a single piece of silicon in order to be able to make interconnections between devices as part of the manufacturing process, and thus reduce size, weight, etc., as well as cost per active element.”

Although the word for it did not yet exist, Noyce was describing the microchip. A former protégé of William Shockley, the co-inventor of the transistor, Noyce understood the transformative potential of new technology as well as anyone alive. His halting follow-up on his initial idea therefore casts light not just on the history of computers but on the often befogged pathways that lead to scientific advancement.

As Leslie Berlin, a visiting scholar at Stanford University, relates in her new biography, The Man behind the Microchip: Robert Noyce and the Invention of Silicon Valley, “After noting his ideas in his lab notebook, Noyce did…nothing.”

Fairchild was a new company, and, as Noyce later recalled, he was preoccupied with selling transistors, not with inventions “that might make you some money somewhere down the road.” Noyce did not “invent” the chip to create something new but to solve an existing problem in an industrial process.

The problem was that circuits consisted of numerous discrete components (transistors, resistors, and so forth) requiring thousands of interconnections. Electronics users configured their own circuits by attaching these components to each other one at a time, “a process fraught,” Berlin tells us, “with errors and failures.” As the number of interconnections rose, so did the odds of system failure. By the late 1950s, a score of companies were looking for a solution.

Two months after Noyce’s notebook entry, Texas Instruments announced that one of its engineers, Jack Kilby, had invented a crude integrated circuit. This may have been the spark that inspired Noyce to return to his notebook. In July, five months after Kilby, Noyce filed a patent application for an integrated circuit. Though Kilby was first, he merely placed all the components on a single slab of germanium and wired them together the standard way – by hand. Noyce’s design was easier to mass-produce. His integrated circuit connected components in a single circuit on a chip of silicon that was small enough, as Berlin writes, to be “carried off by an ant.”

Berlin’s rigorously factual account portrays the scientific process in all its grittiness. Not only were the events that led to the Fairchild integrated circuit “murky” (Noyce was inspired by the work of one of his colleagues, Jean Hoerni), but after the fact, the engineers failed to realize what they had wrought. Some executives within Fairchild were opposed to investing in the commercial development of integrated circuits on the grounds that they were prohibitively expensive and threatened transistor sales.

But Fairchild didn’t quite give up. In 1961, it did launch a primitive integrated circuit dubbed the Micrologic, though the $100 price tag limited demand. Finally, in 1964, Noyce made a bold decision: to cut the price of the circuit below what it was costing Fairchild’s customers to buy and then solder the individual components themselves.

Once the chip became economical to purchase, sales took off. Fellow Fairchild founder Gordon Moore later said the decision to cut prices was as important as the invention itself. It established a pattern for Silicon Valley that still endures. As Moore put it, “Whenever there’s a problem, you lower the price.” By 1965, Noyce could see the future. He told a group of financial analysts to get ready for portable telephones, personal paging systems, and palm-sized televisions.

In 1968, Noyce and Moore bolted from Fairchild and founded Intel. There, Noyce rather sadly became a front man and eventually a figurehead. Berlin does not spare us the depiction of Noyce’s shortcomings, including the details of his troubled first marriage. After Intel, he became a lobbyist for the semiconductor industry – not the finale one envisions for a legend, but in keeping with Noyce’s modest self-appraisal.

He was often asked when he would win the Nobel Prize. “They don’t give Nobel Prizes for engineering,” he would say with a smile. Noyce died in 1990. Had he lived, he undoubtedly would have shared the stage with Kilby, who in 2000 did indeed win a Nobel in physics for ushering in the age of computers.

Chip Maker

The Man behind the Microchip: Robert Noyce and the Invention of Silicon Valley
By Leslie Berlin

Oxford University Press 2005, $30.00
