Intel’s Power Play
The world’s biggest microchip maker is preparing a significant shift in the way that many future computers will handle data. On Tuesday, at its Intel Developer Forum (IDF) in San Francisco, the company revealed further details of Nehalem, a more power-efficient chip architecture that will be at the heart of many future products. Intel disclosed power-saving features that promise to let servers, desktops, and laptops run faster without needing more power.
Rajesh Kumar, an Intel fellow and a key architect involved with developing Nehalem, described the tricks used to make the architecture less power hungry. Importantly, a new power-saving control unit on the chip itself has the sole task of monitoring the workload of each of the chip’s individual data-processing units, or “cores.” If only two cores of a four-core machine are active, for instance, the control unit will completely shut down the inactive cores and divert spare power to active ones. The unit can also moderate the speed and power consumption of each core independently.
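The control unit’s behavior can be sketched in a few lines. This is an illustrative model only, not Intel’s actual design: the `Core` class, the idle threshold, and the equal-share budget split are all our own assumptions, used to show the idea of gating idle cores and diverting their power to active ones.

```python
# Toy model (assumed, not Intel's design) of a per-core power control unit:
# cores below an idle threshold are power-gated, and the freed budget is
# redistributed evenly to the cores that remain active.

from dataclasses import dataclass

@dataclass
class Core:
    load: float          # fraction of cycles busy, 0.0-1.0
    power_w: float = 0.0
    gated: bool = False

def rebalance(cores, total_budget_w, idle_threshold=0.05):
    """Gate near-idle cores; split the power budget among the rest."""
    active = [c for c in cores if c.load >= idle_threshold]
    for c in cores:
        if c in active:
            c.gated = False
        else:
            c.gated = True       # fully shut down: no switching or leakage power
            c.power_w = 0.0
    share = total_budget_w / len(active) if active else 0.0
    for c in active:
        c.power_w = share        # spare headroom lets active cores clock higher
    return cores

cores = [Core(0.9), Core(0.7), Core(0.0), Core(0.01)]
rebalance(cores, total_budget_w=100.0)
# The two busy cores now share the full 100 W; the idle pair is gated off.
```

In the real chip the unit also varies each core’s speed independently; the model above captures only the on/off gating and budget-diversion step.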
In addition to moderating the manner in which the cores crunch data, Intel researchers considered the behavior of the transistors within each core. With Nehalem, these are made using so-called 45-nanometer technology. On this scale, the materials used to make the transistors tend to persistently leak electricity, even when they are shut off.
So, to further save power, Intel’s engineers developed a way to shut off transistors when they aren’t in use. “The concept is trivially obvious and has been around for decades,” says Kumar, “but doing it was hard.” It required developing new transistor technology to ensure that the switch had low resistance when it was on but an extremely high resistance when off.
Using the same amount of power, a Nehalem machine can throw more processing cycles at a problem. In simple terms, Intel says, Nehalem will enable high-end desktops to render 3-D animation almost twice as quickly as the fastest chips available today, making video games more realistic and bringing high-quality animation software closer to the masses.
Nehalem has garnered quite a bit of attention from industry analysts since the first details were revealed in 2007. This is because it’s the first time in more than two decades that Intel has completely overhauled the way that data flows between different components on a chip. The overhaul is necessary because, as engineers add more cores to processors, bandwidth becomes a concern, and it becomes harder to prevent data bottlenecks from reducing performance. “It’s a massive redesign,” says Nathan Brookwood, founder of Insight64, an analyst firm. “It has tremendous implications for Intel and all of Intel’s partners.”
Prior to Nehalem, Intel chips had an external memory controller that moved information between the processing cores and the chips’ memory, where frequently used data is stored. Because the controller was separate from the processors, several cores had to share bandwidth. By integrating the memory controllers into the processors, Nehalem has more than three times as much bandwidth. A similar approach was implemented by rival chip maker AMD in 2003, but Nehalem is Intel’s first chip design with such a feature.
Another performance boost comes from a feature called multi-threading, which lets each processor core execute two streams of instructions at once. As long as software is written to exploit multi-threading, the feature can effectively make a dual-core machine behave like a quad-core one.
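The catch in that last sentence is the software side: a program only sees the extra hardware threads if its work is divided into parallel tasks. A minimal sketch of that division, using Python’s standard thread pool (the function names and chunking scheme here are our own, purely for illustration):

```python
# Illustrative sketch: splitting one job into chunks that a pool of
# worker threads can run concurrently. On a multi-threaded processor,
# each hardware thread can pick up one of these tasks.

from concurrent.futures import ThreadPoolExecutor

def partial_sum(lo, hi):
    return sum(range(lo, hi))

def parallel_sum(n, workers=4):
    """Sum 0..n-1 by handing each worker a contiguous chunk."""
    chunk = n // workers
    bounds = [(i * chunk, (i + 1) * chunk if i < workers - 1 else n)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(lambda b: partial_sum(*b), bounds))

print(parallel_sum(1_000_000))  # same answer as sum(range(1_000_000))
```

Code written as a single sequential loop gains nothing from the extra logical cores; only once it is restructured like this can the hardware spread the work across them.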
Multi-threading is one of the features that will enable Nehalem machines to run more exciting applications. At IDF, Rick Willardson, Intel’s product marketing engineer for desktop CPUs, showed off the winning entry in a contest to find the best new application for Nehalem: a program that quickly searches photo libraries using an original picture as the query. In the demo, Willardson searched for a picture of two kids on a beach towel with an American flag design, using a picture of a flag found online. The demo machine took only a couple of seconds to find hundreds of matching images from a photo library.
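The article doesn’t say how the winning entry works internally, but the general idea behind search-by-example is easy to sketch: reduce every image to a compact fingerprint and rank the library by fingerprint similarity to the query. The toy version below (our own assumption, not the contest program) uses a simple average hash over grayscale pixel grids; real systems use far richer features, and each comparison is independent, which is exactly the kind of work that parallelizes well across cores.

```python
# Toy search-by-example (illustrative only): images are 2D lists of
# grayscale values; each is reduced to a bit-string fingerprint, and
# the library is ranked by Hamming distance to the query's fingerprint.

def average_hash(pixels):
    """Return a bit string: 1 where a pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def search(query, library, top_k=3):
    """Rank library images by fingerprint similarity to the query."""
    qh = average_hash(query)
    ranked = sorted(library.items(),
                    key=lambda kv: hamming(qh, average_hash(kv[1])))
    return [name for name, _ in ranked[:top_k]]

flag = [[200, 40], [40, 200]]                    # toy 2x2 "query" image
photos = {
    "beach_flag.jpg": [[210, 50], [30, 190]],    # similar light/dark pattern
    "forest.jpg":     [[60, 60], [70, 65]],      # different pattern
}
print(search(flag, photos, top_k=1))  # the flag-like photo ranks first
```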