For the semiconductor industry, what started out as a cute description of a technology trend has become something like a force of nature. It’s called Moore’s Law. In fact, it’s not a law at all, but a rule of thumb that Silicon Valley pioneer Gordon Moore cooked up back in the 1960s. Moore, co-founder of Intel, noticed that the number of transistors being packed into integrated circuits was doubling every year, and he predicted this trend would continue.
The strange thing is, it turned out to be more or less true. Moore’s observation has become the de facto law behind the meteoric rise of the computing industry. Ever since 1975, the number of transistors on a semiconductor chip has doubled roughly every 18 months, enabling microprocessors and memories to get larger and more complex, and far cheaper. Powering this trend is the shrinking of transistors with each chip generation.
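The compounding behind that 18-month doubling is easy to underestimate. A minimal sketch, in which the base year, starting transistor count, and doubling period are illustrative assumptions rather than figures from the article:

```python
# Sketch of Moore's Law scaling: transistor count doubling every 18 months.
# The base year and starting count below are hypothetical, chosen only to
# show the shape of the curve.

def transistor_count(year, base_year=1975, base_count=10_000, doubling_months=18):
    """Project transistor count under a fixed doubling period."""
    months = (year - base_year) * 12
    return base_count * 2 ** (months / doubling_months)

# Over a single decade the count grows roughly a hundredfold:
ratio = transistor_count(1985) / transistor_count(1975)
print(round(ratio))
```

Ten years is 120 months, or 6.67 doublings, so the decade-over-decade multiplier is about 2^6.67, just over a hundred.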
This rapid diminution of microelectronics makes possible today’s information revolution. For chip manufacturers, the pace has been brutal but lucrative. “Throughout the 1980s,” says Paolo Gargini, director of Intel’s strategic research, new generations of chips “were on a three-year cycle. But in the 1990s, we’ve begun moving to a two-year cycle.”
But the joyride can’t continue forever, at least not with the technology now in use to make electronics. According to the latest roadmap charted by the Semiconductor Industry Association (SIA), the minimum size of features in integrated circuits will have to hit 130 nanometers by 2003 to keep pace with Moore’s Law. That should be doable by adapting existing fabrication methods. But after 2003, the deluge. Following that year on the SIA roadmap is a lot of red, the color used by the trade association to indicate a lack of consensus about how to solve the fabrication challenges looming beyond the 100-nanometer barrier.
The problem is that the dominant technology used to make chips, optical lithography, uses light to project circuit patterns onto silicon. Below 100 nanometers, however, the features become smaller than the wavelengths of the light typically employed in chip fabrication, which can no longer resolve them. Not that there is a shortage of possible alternatives to optical lithography. Indeed, every one of the large chip makers has its own favorite candidates. But no one is sure which method will win out. And since it takes several years to get a chip fabrication plant up and running, the clock is ticking.
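The resolution ceiling can be estimated from the classical Rayleigh criterion, where the smallest printable feature is proportional to the wavelength divided by the lens’s numerical aperture. A minimal sketch; the wavelength and numerical-aperture values are illustrative assumptions for optical tools of the era, not figures from the article:

```python
# Sketch of the Rayleigh resolution criterion for optical lithography:
#   minimum feature ~= k1 * wavelength / numerical_aperture
# k1 = 0.5 is the classical Rayleigh proportionality factor; the
# 248 nm wavelength and 0.6 NA below are assumed example values.

def min_feature_nm(wavelength_nm, numerical_aperture, k1=0.5):
    """Smallest printable feature, in nanometers, under the Rayleigh criterion."""
    return k1 * wavelength_nm / numerical_aperture

# A 248 nm deep-ultraviolet source with a 0.6-NA lens:
print(round(min_feature_nm(248, 0.6)))  # about 207 nm
```

The point of the arithmetic: even with aggressive process tricks that push the proportionality factor down, features well below 100 nanometers demand far shorter wavelengths, which is what the post-optical candidates described next are meant to supply.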
In the meantime, the large semiconductor manufacturers are worried. “You always jump into these [new] technologies with the best of intentions,” says Intel’s Gargini, “but until you can actually print circuits that you can sell, you never know.”
In December, the companies that make up SIA and Sematech (the consortium of U.S. semiconductor makers) gathered to narrow down the choices. The 120 industry wizards in attendance picked two post-optical strategies for further development. The first choice was a photon-based technique called extreme ultraviolet (EUV) lithography, which is backed by a powerful alliance that includes Intel and Motorola; an electron-beam lithography method called Scalpel, under development by Lucent Technologies, was runner-up. IBM’s candidate, X-ray lithography, came in third.
The decision has no binding effect; the companies can continue to develop any method they please. Indeed, rather than signaling a final decision, the voting points to the intense competition sure to come. “This will be the start of a big shootout over next-generation technologies,” says Don Kania, chief technology officer at Veeco, a leading manufacturer of lithography tools. Adds Intel’s Gargini: “The key thing is that instead of triplicating the effort, we can synchronize.”