
This article is part 1 of a two-part series; part 2 will appear on Friday, December 16.

When you can’t make a microprocessor run faster, what do you do? You combine two or more microprocessor cores, of course.

Intel and AMD, the top industry rivals, have already introduced dual-core chips for desktop PCs. And that's just the start of a trend that could bring an important change to PCs: multicore processing. Both companies hope to pack four cores into desktop PC chips by 2007. And Intel researchers are investigating how to put tens or even hundreds of cores onto a single chip.

Both chipmakers and PC makers need multicore chips for an important reason: they have run out of performance headroom on existing designs. For years, chipmakers added transistors and ratcheted up clock speeds to make processors run faster, but clock speeds can be increased only so much before a chip radiates too much heat inside the PC case.
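A rough rule of thumb from chip design, not spelled out in the article, shows why heat sets the limit. In LaTeX notation, a processor's dynamic power dissipation is approximately

P_{\mathrm{dynamic}} \approx C \, V^{2} f

where C is the chip's switched capacitance, V its supply voltage, and f its clock frequency. Because higher clock speeds generally demand a higher voltage, power, and with it heat, climbs far faster than performance; two moderately clocked cores can therefore deliver more throughput per watt than one core pushed to its thermal ceiling.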

But why does the average PC user need two, four, or eight cores on a chip? For starters, think multitasking. “I call multitasking the silent ‘killer app’,” says Shane Rau, program manager for semiconductor research at market-research firm IDC. “Today, all the apps we’re using are nickel-and-diming the processor to death.”

Most individual applications already run well on their own. But as any Windows user knows, running several programs at once (say, a word processor, an audio player, and antivirus software) will eventually make the unwanted hourglass appear. Multicore processing could end that waiting period.

Furthermore, today's ever-changing security threats mean multitasking demands will only grow, industry observers say. People will keep running more applications at once, even as PCs must run more security programs in the background just to protect themselves.

In particular, streaming audio and video tasks can hog microprocessor resources. Intel believes multiple cores will be much better at tasks such as downloading video from a PC to a personal media player. And Intel's upcoming multicore processors run cooler than their single-core predecessors, which could lead to innovative notebook and desktop PC case designs. The company's dual-core chip, nicknamed “Yonah,” is set to debut in early 2006.

At the International Consumer Electronics Show in Las Vegas next month, Taiwanese PC maker AOpen plans to demonstrate a Yonah-powered machine about the size of Apple's Mac mini desktop computer, which measures 17 centimeters wide, 17 centimeters deep, and 5 centimeters tall. Yonah played a big role in Apple's recent decision to buy Intel chips, according to Kevin Krewell, editor-in-chief of In-Stat's Microprocessor Report, because Yonah will enable faster PowerBook notebooks.

First-Round Mistakes

Early dual-core chips, such as Intel's Pentium D, got mixed reviews, mainly because their performance gains were unimpressive when running software designed for traditional single-core processors. To really tap the power of dual-core or multicore chips, applications must be written, or rewritten, to spread their work across two or more cores, a process called multithreading.
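To make "multithreading" concrete, here is a minimal sketch in C, assuming a POSIX system with the standard pthreads library; the article names no particular API, and the array-summing workload and every identifier below are illustrative, not anything from Intel or AMD. The program splits one job into two threads that a dual-core chip can run side by side.

/* Multithreading sketch: split a summation across two threads.
 * Assumes POSIX threads; compile with: cc -pthread sum.c */
#include <pthread.h>
#include <stdio.h>

#define N 1000000

static double data[N];

struct slice { int start, end; double sum; };

/* Each thread independently sums its own half of the array. */
static void *partial_sum(void *arg)
{
    struct slice *s = arg;
    s->sum = 0.0;
    for (int i = s->start; i < s->end; i++)
        s->sum += data[i];
    return NULL;
}

int main(void)
{
    for (int i = 0; i < N; i++)
        data[i] = 1.0;                        /* dummy workload */

    struct slice halves[2] = { { 0, N / 2, 0.0 }, { N / 2, N, 0.0 } };
    pthread_t tid[2];

    for (int t = 0; t < 2; t++)               /* fork the two workers */
        pthread_create(&tid[t], NULL, partial_sum, &halves[t]);
    for (int t = 0; t < 2; t++)               /* wait for both to finish */
        pthread_join(tid[t], NULL);

    printf("total = %f\n", halves[0].sum + halves[1].sum);
    return 0;
}

On a single-core processor the two threads merely take turns; on a dual-core chip the operating system can schedule one on each core. Unmodified single-threaded programs never split their work this way, which is why early dual-core chips showed such modest gains on existing software.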
