Moore’s Law Lives
Intel has announced a chip that the company says keeps it on track with Moore’s Law, the 1965 prediction that the number of transistors on a chip will double roughly every two years. Indeed, the company has just manufactured a test chip that packs about twice as many transistors as the previous version.
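The doubling rule above compounds quickly, which is easy to see with a short back-of-the-envelope calculation. The function and starting figures below are illustrative assumptions, not Intel data:

```python
def projected_transistors(start_count, start_year, year, doubling_period=2):
    """Project a transistor count under Moore's Law: the count doubles
    every `doubling_period` years after `start_year` (an idealization)."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# A hypothetical chip with 500 million transistors in 2005 would be
# projected to reach roughly one billion by 2007.
print(round(projected_transistors(500e6, 2005, 2007)))
```

Real chips deviate from this smooth curve, of course; the prediction holds generation to generation, not month to month.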
This new generation of test chip, dubbed “45 nanometer” because of the size of its circuit features, contains more than one billion transistors. Using this manufacturing process, Intel could fabricate microprocessors with either double the processing power of the previous 65 nanometer chips or half the chip size at the same speed and power.
The 45 nanometer test chip was manufactured as a proof of principle. Traditionally, says Mark Bohr, a senior fellow in Intel’s technology and manufacturing group, the entire microprocessor is ready to go about a year and a half after the test chip is announced; since the preceding 65 nanometer microprocessors began shipping in October 2005, the 45 nanometer technology is right on track.
So will average consumers feel the difference in the ever-increasing number of transistors in their computers? One place where they might is video. Media is migrating from television sets to computers, and that’s one trend where such chip advances will matter, says Nathan Brookwood, analyst at Insight 64. “Everyone’s going to want little home servers that can download movies and stream over high-speed wireless networks,” he says.
Dual-core processors, such as those made by AMD and Intel, combine two processing centers onto a single chip. Although fast, they still don’t have the processing power to easily handle multiple, complicated media functions at once. Transferring a movie from a DVD to a portable video player, for instance (a file conversion process called “transcoding”), chews up a significant amount of time and power, in part because of complicated digital rights management software, Brookwood points out.
Indeed, transcoding using even a dual-core processor can take up to 30 seconds for every minute of video. Computers with more than two cores per chip could speed up this task. And because transistors are shrinking, the amount of chip real-estate that a processor sits on is also decreasing. This means more room for more cores that can handle process-hungry tasks such as transcoding. “As these kinds of digital realities creep into our lifestyle, people will need more processing power, even though they don’t know it,” says Brookwood.
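The 30-seconds-per-minute figure above implies some sobering arithmetic, which the sketch below works through. The assumption that transcoding speed scales linearly with core count is a simplification of my own; real transcoders only approximate it:

```python
def transcode_seconds(video_minutes, sec_per_min=30.0, cores=2, baseline_cores=2):
    """Estimate transcode time, assuming `sec_per_min` seconds of work per
    minute of video on `baseline_cores` cores, and (simplistically) a
    linear speedup as cores are added."""
    return video_minutes * sec_per_min * baseline_cores / cores

movie = 120  # a two-hour movie, in minutes
print(transcode_seconds(movie))           # dual-core baseline: a full hour
print(transcode_seconds(movie, cores=8))  # hypothetical 8-core chip
```

At 30 seconds per minute, a two-hour movie ties up a dual-core machine for an hour; quadrupling the core count would, under this idealized scaling, cut that to fifteen minutes.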
To make the 45 nanometer chip, Intel engineers shined ultraviolet light through a glass and chrome mask, a stencil that outlines where the features will be made. Because the wavelength of light used was 193 nanometers, much larger than the chip features, engineers had to tweak the mask design to compensate for the blurriness that can occur when the wavelength of light is larger than the mask features it passes through.
But these tricks have their limitations. As chip components become even smaller, new tactics will have to be explored. One option is extreme ultraviolet lithography, in which the wavelength of light is just 13.5 nanometers. This would allow features below 10 nanometers to be crafted. Bohr says that Intel is “still exploring different options” for photolithography to make their future chips at 32 nanometers, 22 nanometers, or smaller, and one option may include extreme ultraviolet lithography.
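The node names in the roadmap above (65, 45, 32, 22 nanometers) follow a simple geometric pattern: each generation shrinks linear features by roughly a factor of the square root of two, which halves a transistor’s area and so doubles density. The helper below is an idealized sketch of that pattern, not Intel’s actual roadmap:

```python
import math

def next_node(feature_nm):
    """Idealized node scaling: shrink linear dimensions by ~1/sqrt(2)
    per generation, halving area (and doubling transistor density)."""
    return feature_nm / math.sqrt(2)

node = 65.0
for _ in range(3):
    node = next_node(node)
    print(f"{node:.1f} nm")  # falls near the named 45, 32, 22 nm nodes
```

Marketing node names are rounded, but the underlying area-halving is what keeps the transistor count doubling from one generation to the next.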
With Moore’s Law operating in extended lifetime mode, how long can chip makers keep up the pace? Dennis Buss, vice president of silicon technology development at Texas Instruments, suspects that the technology could go on until 2015 or 2020, when 10 nanometer chip features are predicted. At that point, he says, “scaling as we know it will definitely stagnate.”