Moore’s Law Lives Another Day
“[Gordon] Moore is my boss, and if your boss makes a law, then you’d better follow it,” says Mark Bohr, who leads Intel’s efforts to make advances in microchip design practical to manufacture. Moore’s Law, of course, was first proposed by Bohr’s boss in 1965, when Moore observed that the number of transistors on a chip was doubling every year. The law took its current form in 1975, when Moore revised the pace to a doubling every two years. Remarkably, the computer industry has maintained that pace ever since, training us to expect computers to become ever faster in the process.
After Monday’s launch of Intel’s newest line of processors, named Ivy Bridge, Moore’s prediction is still looking sound. The chips are the first available from any company with features as small as 22 nanometers (the finest features on other current chips are 32 nanometers), allowing transistors to be made smaller and packed more densely. Ivy Bridge chips offer 37 percent more processing speed than the previous generation of chips, and can match their performance while using just half the energy.
Transistors on an Ivy Bridge processor are packed roughly twice as densely as on the most recent line of Intel chips, with 1.4 billion on a 160-square-millimeter die instead of 1.16 billion on a 212-square-millimeter die. Upholding Moore’s Law like that required a significant redesign of the transistor, the tiny electronic switch that digital chips are built from. Existing transistor designs, little changed in decades, could not simply be shrunk to 22-nanometer features: at that scale they become leaky, allowing some current to flow through a transistor even when it is switched off. Intel got around that by adding an extra dimension to its transistors, which for decades have been made as flat layers of material stacked on top of one another.
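As a rough, back-of-the-envelope check on those figures (a sketch added here, not a calculation from Intel or the article), the density gain expected from a 32-to-22-nanometer shrink and the density implied by the quoted die sizes can be worked out directly:

```python
# Illustrative check only: the die sizes and transistor counts come from the
# article; the scaling reasoning is an assumption. Chip area per transistor
# scales roughly with the square of the linear feature size.
old_nm, new_nm = 32, 22
ideal_gain = (old_nm / new_nm) ** 2
print(f"Ideal density gain from the shrink: ~{ideal_gain:.1f}x")  # ~2.1x

# Density implied by the quoted transistor counts and die areas.
ivy_density = 1.4e9 / 160       # transistors per square millimeter
prev_density = 1.16e9 / 212
print(f"Die-level density ratio: ~{ivy_density / prev_density:.1f}x")  # ~1.6x
# The die-level ratio comes out lower than the ideal scaling, presumably
# because a die also holds cache, graphics, and I/O that shrink less neatly.
```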
A transistor’s basic design comprises separate electrodes for incoming and outgoing current, known as the source and drain; material connecting the two, known as the channel; and a third electrode known as the gate, which controls the flow of current through the channel. Rather than being a flat layer, the channel of Intel’s reinvented transistors is a long “fin” that protrudes up into the gate electrode above, creating a more intimate electrical connection between the two. Because the gate wraps around three sides of the fin, Intel refers to its three-dimensional transistors as having a “tri-gate” design.
Similar designs were first suggested in Japan in the 1980s, and developed for many years at the University of California, Berkeley, starting in the 1990s. Intel started investigating the design around 2000, says Bohr, and in 2008 committed to using it. “It’s one thing to make a lab device, but a very different thing to make sure it can produce chips at low cost and high volume,” says Bohr. He says Intel is reusing many existing factory processes, and, as a result, patterning a silicon wafer with Ivy Bridge designs costs only around 2 percent more than it did for Intel’s previous generation of chips.
Intel’s launch of desktop Ivy Bridge chips this week leaves it technologically ahead of its competitor, AMD, which doesn’t have public plans to adopt three-dimensional transistors or use 22-nanometer technology. Versions of the new technology for laptops are due in the summer, but more important to Intel may be the potential for Ivy Bridge chips to help it break into the market for energy-efficient processors needed for tablets and smart phones.
Intel’s three-dimensional transistors will come to the company’s Atom line of mobile processors in 2013. Intel wants those chips to be used in smart phones and tablets, and has signed deals with Lenovo and Motorola to that end.
As for the future prospects for Moore’s Law, Bohr says that his group is already working on manufacturing processes for a version of the three-dimensional transistors with 14-nanometer features, scheduled for production in 2014. “It’s becoming more challenging, but I don’t see the end [to Moore’s Law],” says Bohr.