Intel's New Strategy: Power Efficiency
Spurred by competitor AMD’s rapid success, Intel is shifting its strategy toward more power-efficient microprocessors.
Amid increasing competition from Advanced Micro Devices (AMD), Intel is changing its chip-making philosophy: it’s paying more attention to the power requirements of its microprocessors.
In July 2006, the chip-making giant will release a new microprocessor, called Core 2 Duo, designed for laptops and desktops. The new chip is based on Intel’s current chip architecture, which replaced traditional single-core processing with two processing centers on a single chip. The company says that the Core 2 Duo will perform better than its current dual-core chip, and will be more energy-efficient, which could make laptop batteries last longer and desktop towers run cooler.
Paying attention to power consumption in microprocessors is a relatively new concept for the company, says Steve Pawlowski, a senior fellow at Intel, adding that the move may help Intel regain market share from its rival AMD. Historically, the most important metric in the industry has been processor performance – the speed at which a processor can complete a task, such as calculating a spreadsheet. “We’ve always focused on performance at the expense of power [use],” Pawlowski says.
But basic changes have occurred in the PC market, which first led AMD, and now Intel, to rethink microprocessor designs. First, mobile devices have become the primary PC for many consumers – who don’t want a device that quickly drains a battery or gets too hot. Furthermore, as transistors shrink, they’re more likely to waste electricity through a physical process called “leakage,” says Kevin McGrath, an AMD fellow – and the more transistors on a chip, the more electricity is wasted.
AMD has been working on more-efficient microprocessors for several years, and now Intel is trying to level the playing field. Both Intel and AMD have tackled part of the problem by converting their chip line-ups to dual-core processors (see “Multicore Mania,” December 2005), which turns out to be one way to increase efficiency. “Interestingly, going to multiple cores can be a very power-efficient way of computation,” says Milo Martin, professor in the computer and information sciences department at the University of Pennsylvania.
Three aspects of multicore chips make them more efficient. First, when a chip has more than one core, each core’s clock can be slowed without reducing the throughput of the chip as a whole. By slowing the clock, explains Martin, engineers can decrease the computational rate of a single core by a factor of five, from one gigahertz to 200 megahertz; and because a slower clock also allows a lower supply voltage, the core consumes only about one-thirtieth of the power. Then, he says, even if five of those cores are assembled onto a single chip, only one-sixth of the power is consumed, yet the total computational rate of one gigahertz is maintained.
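Martin's arithmetic can be checked directly. The sketch below just works through the numbers as stated in the article – the one-thirtieth per-core power figure is taken from his example, not derived here:

```python
# Worked arithmetic for Martin's multicore example (figures from the article).
# The ~1/30 per-core power number reflects that a slower clock also permits
# a lower supply voltage, so power falls much faster than frequency.

full_speed_hz = 1_000_000_000    # one core running at 1 GHz
slow_speed_hz = 200_000_000      # clock slowed by a factor of five
per_core_power = 1 / 30          # power relative to the 1 GHz core

n_cores = 5
total_power = n_cores * per_core_power        # 5/30 = 1/6 of the original power
total_throughput = n_cores * slow_speed_hz    # 5 x 200 MHz = 1 GHz aggregate

print(total_power)        # one-sixth the power
print(total_throughput)   # same total computational rate as one 1 GHz core
```

Five slow cores thus match the aggregate rate of one fast core while drawing a sixth of the power, which is the efficiency argument for going multicore.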
Second, smaller processor sizes reduce power consumption. The number of transistors in each core and the amount of silicon real estate they take up determine the amount of power the core uses – smaller cores have fewer transistors and thus use less power than larger ones. In a dual-core chip, the total number of transistors is greater than it is in a single-core chip, but each individual core has fewer transistors, making it more power efficient.
Third, some of the processor functions, such as controlling memory, can be shared between cores, so that each core consumes less energy by not performing a redundant task.
So transitioning to a multicore architecture is an obvious way to save power, and both Intel and AMD have done so. But they’re also looking at other ways to create efficiency. As Pawlowski explains, managing processors at the circuit and individual-transistor level can save power as well. For instance, specific circuits on a chip are designated to control the manipulation of a photo or to play a DVD. When such a circuit needs to be used, the transistors that make up the circuit are switched on with a certain voltage. In a perfectly efficient chip, those transistors would turn on and off only when they’re needed. In reality, even when a circuit is idle, a small amount of current leaks through its transistors, says Pawlowski. This leakage produces heat and wastes electricity.
While there is much overlap in the ways that AMD and Intel are approaching this problem of waste and leakage at the circuit level, their solutions are different. Intel is working to solve the problem by designating “sleep transistors” on a chip to micromanage the circuits in each core. These transistors completely turn off the voltage to transistors in circuits that are dormant.
AMD also puts portions of the processor to sleep, explains McGrath, but it does so by having an algorithm instruct the processor to enter various levels of sleep, reducing its clock speed so that background computations are carried out more slowly. The algorithm “can ask a part to go into its lowest power state,” he says. “There are five or six of these power states that are used depending on the load of the processor.”
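A minimal sketch of the kind of policy McGrath describes might map measured load onto a handful of discrete power states. The state names, count, and even thresholds here are hypothetical – the article says only that five or six states exist and that selection depends on load:

```python
# Hedged sketch of load-based power-state selection (state names and
# thresholds are hypothetical, not AMD's actual algorithm).

POWER_STATES = ["P0", "P1", "P2", "P3", "P4", "P5"]  # P0 = full speed, P5 = deepest sleep

def pick_power_state(load: float) -> str:
    """Map a processor load in [0.0, 1.0] to a power state; lower load -> deeper state."""
    if not 0.0 <= load <= 1.0:
        raise ValueError("load must be between 0.0 and 1.0")
    # Divide the load range evenly among the states: full load picks P0.
    index = min(int((1.0 - load) * len(POWER_STATES)), len(POWER_STATES) - 1)
    return POWER_STATES[index]

print(pick_power_state(1.0))   # P0: fully loaded, full clock speed
print(pick_power_state(0.0))   # P5: idle, lowest power state
```

A real implementation would also account for the cost of switching states, but the basic contract – load in, one of a few discrete states out – matches the description.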
Intel has announced prices for its new energy-efficient chips – they’re less expensive than AMD’s current offerings, which will put pressure on its rival. For Intel, though, the test of whether its power-saving chips can compete well against AMD’s offerings won’t come until its new processors hit the market.