
Intel, Microsoft Push Parallel Computing

The companies are providing millions of dollars to bolster practical parallel-computing research.
March 19, 2008

Yesterday, in a teleconference, Intel and Microsoft announced that they will begin funding a serious effort to make parallel computing mainstream. The companies are injecting $20 million into computer-science research at the University of California, Berkeley, and the University of Illinois at Urbana-Champaign, dubbing the new research centers the Universal Parallel Computing Research Centers (UPCRC).

For decades, parallel computing, the tricky job of dividing programming tasks among a number of processors, has been the province of academics who write programs for supercomputers. But with the advent of multicore chips (chips with more than one processing core), personal computers are on track to become supercomputers as well.

The main problem outlined at yesterday’s teleconference is that there is still no consensus on the best way to program general-purpose chips with a large number of cores, like the ones headed for consumer computers. Historically, programmers could simply wait for the next generation of chips to make their programs run faster. Now that the next generation of chips gains speed by adding cores rather than by running a single core faster, it is not obvious how to divvy up a program’s tasks among those cores to achieve the same gains.
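To make that concrete, the sketch below (ours, not the companies’) hand-divides one small task, summing an array, across however many cores the machine reports. It is written in modern C++, and every identifier in it is illustrative rather than drawn from any UPCRC project.

// A minimal sketch, not from the article, of what "divvying up the tasks
// among the cores" looks like in practice: summing an array by splitting it
// into one chunk per hardware thread. All names here are illustrative.
// Build (assumed toolchain): g++ -std=c++17 -pthread sum_threads.cpp
#include <cstddef>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<double> data(1'000'000, 1.0);

    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 4;                // fall back if the count is unknown

    std::vector<double> partial(cores, 0.0);  // one partial sum per core
    std::vector<std::thread> workers;
    std::size_t chunk = data.size() / cores;

    for (unsigned t = 0; t < cores; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = (t + 1 == cores) ? data.size() : begin + chunk;
        // Each thread sums its own slice; no two threads touch the same element.
        workers.emplace_back([&, t, begin, end] {
            partial[t] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0.0);
        });
    }
    for (auto& w : workers) w.join();

    double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::cout << "sum = " << total << " using " << cores << " cores\n";
}

The bookkeeping of chunk boundaries, per-thread partial results, and thread joins is exactly the kind of detail the researchers want to take off everyday programmers’ hands.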

“Programmers have no choice: if they want fast programs, they’re going to have to write parallel programs,” says Dave Patterson, the director of the UPCRC and a computer-science professor at Berkeley.

Although no details were given on specific research projects, Tony Hey of Microsoft Research mentioned some areas of interest, which include developing parallel code libraries with chunks of code that are ready to use, testing different memory approaches, and exploring different types of parallel languages.
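The first of those ideas, ready-to-use parallel libraries, can be sketched with today’s C++ standard library (a stand-in we chose purely for illustration, not something produced by the UPCRC): the hand-rolled threading above collapses into a single call, and the library decides how to spread the work across cores.

// Again only a sketch: the same sum written with a prepackaged parallel
// algorithm from the C++17 standard library, standing in for the kind of
// ready-to-use parallel code library the researchers describe.
// Build (assumed toolchain): g++ -std=c++17 reduce_par.cpp -ltbb
#include <execution>
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    std::vector<double> data(1'000'000, 1.0);

    // The library, not the programmer, splits the range across the cores.
    double total = std::reduce(std::execution::par,
                               data.begin(), data.end(), 0.0);

    std::cout << "sum = " << total << '\n';
}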

Below is a roundup of the most interesting quotes from the teleconference.

“The shift in hardware technology from a single core to multicore will have a profound effect in the way we do programming in the future. We’re really in the midst of a revolution in the computing industry.” –Tony Hey, Microsoft Research

“Parallelism is coming not just to high-end systems, but across a very broad slice of computer science, and we expect [it] to permeate every corner of computer science going forward.” –Andrew Chien, director of Intel Research

“The technology revolution that Andrew [Chien] outlined means that every computer will be a parallel computer. Every problem must be a parallel problem … We want to democratize parallelism.” –Marc Snir, codirector of the UPCRC and a professor of computer science at the University of Illinois

“My perspective here is, there’s not that much that universities haven’t invented [in parallel computing], but there was no strong pull from industry because parallelism wasn’t pervasive; it was a small subset of the industry … Now that the pipeline is unclogged, now that there’s strong interest from Microsoft and Intel … work that’s been going on for many years will become much more fruitful.” –Marc Snir
