Every year since 1960, when the integrated circuit industry began, the number of components per chip has about doubled. This phenomenal rate of progress has brought us to the era of “very large scale integrated” (VLSI) circuits. Today, over 150,000 components can be fabricated and interconnected on a single silicon chip about one-tenth the size of a postage stamp, and the number of components per chip can be expected to grow dramatically for at least another 10 to 15 years.
So it went in 1981, when John S. Mayo, a researcher and executive at Bell Labs who was part of the team that built the first computer to use transistors instead of vacuum tubes, described how VLSI was making powerful microelectronics possible. What Mayo described is known as Moore’s Law–Intel cofounder Gordon Moore’s prediction that the number of components on a chip would double every two years. (In his original 1965 paper, Moore predicted an annual doubling; he revised the rate to every two years in 1975.) In the 29 years since Mayo’s essay, Moore’s Law has held up nicely: the latest chips feature more than two billion components.
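The compounding Moore described is easy to check with a back-of-the-envelope calculation. A minimal sketch, taking the Intel 4004 of 1971 (roughly 2,300 transistors) as a baseline–a reference point of my choosing, not one the essay cites:

```python
# Back-of-the-envelope check of Moore's Law: components double every two years.
# Baseline assumption: the Intel 4004 (1971) integrated about 2,300 transistors.

def projected_components(base_count, base_year, year, doubling_period=2):
    """Component count projected forward by compounding doublings."""
    doublings = (year - base_year) / doubling_period
    return base_count * 2 ** doublings

# Projecting from 1971 to 2010, when this retrospective was written:
estimate = projected_components(2300, 1971, 2010)
print(f"{estimate:,.0f} components")
```

Nineteen and a half doublings over those 39 years put the projection on the order of a billion components–in line with the "more than two billion" the latest chips actually carry.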
Even then, Mayo saw that the microprocessor industry could sustain its pace of innovation. Engineers, he realized, could use the power of computer chips themselves to design chips that were even more powerful: complex, efficient computers begetting computers more complex and efficient still.
A primary example is computer-aided design (CAD). In the last five years, integrated circuits have become so complex that without the advanced analyses and extensive simulation techniques available through CAD, it would be virtually impossible to design VLSI chips. But with CAD, even complex circuits can be designed in a few months with no real increase in labor.
The advances in fabrication and design drove down the cost of a digital logic gate to the point where the average person in 1981 could buy a cheap pocket calculator to do the same work that previously would have required a huge, expensive computer. Microprocessors were on their way to becoming ubiquitous.
Some discrete high-frequency components have been built with submicron dimensions, and there are indications that dimensions in the tens of nanometers (one-billionth of a meter) are technologically feasible. Such dimensions are 100 times smaller (10,000 times smaller in area) than the current chips and may lead to packing more than 1 billion components on each chip.
Mayo had joined Bell Labs in 1955, eight years after the first transistor had been invented there, so he had an insider’s view of what cramming a billion transistors onto a chip would mean. He was especially excited about distributed computing–that is, networking–and telecommunications.
VLSI will make possible systems even more complex in hardware and simpler in software, opening a new area of software science–distributed software systems, wherein a whole family of computers of many sizes operates under centralized software control. …
The impact of microelectronics on communications is nearly as profound. … Electronic switchers now make it possible to forward calls automatically, to reach frequently called numbers through abbreviated dialing codes, to notify users of other incoming calls, and to conduct three-way conference calls.
The full implications for communications were beyond anything Mayo could have predicted. Cell phones, wireless networks, downloadable apps–in other words, many of the things we take for granted (see Briefing)–were all made possible by the increasing power and availability of microprocessors. And yet Mayo clearly sensed the developments to come, even pointing out an early foray by the media into interactive content:
Other communications concepts are on the horizon. Advanced Mobile Telephone Service can provide service to large numbers of people in vehicles and is working well on a trial basis in Chicago. And the VIEWTRON system, on trial in Coral Gables, Fla., enables one to display on a television screen some 15,000 “frames” of information–some interactive–transmitted via telephone lines from a data bank.