Intel, Microsoft Push Parallel Computing

The companies are providing millions of dollars to bolster practical parallel-computing research.
March 19, 2008

Yesterday, in a teleconference, Intel and Microsoft announced that they will start to fund serious efforts to make parallel computing mainstream. The companies are injecting $20 million into computer-science research at the University of California, Berkeley, and the University of Illinois at Urbana-Champaign, dubbing the new centers Universal Parallel Computing Research Centers (UPCRC).

For decades, parallel computing, the tricky job of dividing programming tasks among a number of processors, has been the province of academics who write programs for supercomputers. But with the advent of multicore chips, which pack more than one processing core onto a single chip, personal computers are on track to become supercomputers as well.

The main problem outlined at yesterday’s teleconference is that there is still no consensus on the best way to easily program general-purpose chips with many cores, like the ones in consumer computers. Historically, programmers could simply wait for the next generation of chips to come out to make their programs run faster. Now that the next generation of chips delivers more cores rather than a faster single core, it’s not as obvious how to divvy up a program’s tasks among those cores to achieve the same gains.
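To make the divvying-up concrete, below is a minimal sketch, not drawn from the article or from the UPCRC’s work, of one common pattern: splitting an array sum across a few POSIX threads so that each core sums its own slice and the main thread combines the partial results. The thread count, array size, and the partial_sum helper are illustrative assumptions.

```c
/* Illustrative only: dividing one task (summing an array) across
 * several cores with POSIX threads. Compile with: cc -pthread sum.c */
#include <pthread.h>
#include <stdio.h>

#define N_THREADS 4
#define N_ITEMS   1000000

static double data[N_ITEMS];

struct slice { int start, end; double sum; };

/* Each thread sums only its own slice; nothing shared is written,
 * so no locking is needed until the final combine step. */
static void *partial_sum(void *arg) {
    struct slice *s = arg;
    s->sum = 0.0;
    for (int i = s->start; i < s->end; i++)
        s->sum += data[i];
    return NULL;
}

int main(void) {
    for (int i = 0; i < N_ITEMS; i++) data[i] = 1.0;

    pthread_t threads[N_THREADS];
    struct slice slices[N_THREADS];
    int chunk = N_ITEMS / N_THREADS;

    for (int t = 0; t < N_THREADS; t++) {
        slices[t].start = t * chunk;
        slices[t].end   = (t == N_THREADS - 1) ? N_ITEMS : (t + 1) * chunk;
        pthread_create(&threads[t], NULL, partial_sum, &slices[t]);
    }

    double total = 0.0;
    for (int t = 0; t < N_THREADS; t++) {
        pthread_join(threads[t], NULL);
        total += slices[t].sum;   /* combine the per-core results */
    }
    printf("total = %f\n", total);
    return 0;
}
```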

“Programmers have no choice: if they want fast programs, they’re going to have to write parallel programs,” says Dave Patterson, the director of the UPCRC and a computer-science professor at Berkeley.

Although no details were given on specific research projects, Tony Hey of Microsoft Research mentioned some areas of interest, including developing libraries of ready-to-use parallel code, testing different approaches to memory, and exploring different kinds of parallel programming languages.
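As a hedged illustration of the “ready-to-use parallel code” idea, the same array sum can be expressed with OpenMP, an existing directive-based standard in which a single annotation asks the compiler and runtime to split a loop across cores. This is one example of such a library, not one the new centers have said they will adopt.

```c
/* Illustrative only: the same parallel sum written with OpenMP.
 * Compile with: cc -fopenmp sum_omp.c (the pragma is ignored, and the
 * code still runs serially, if OpenMP support is not enabled). */
#include <stdio.h>

#define N_ITEMS 1000000

int main(void) {
    static double data[N_ITEMS];
    for (int i = 0; i < N_ITEMS; i++) data[i] = 1.0;

    double total = 0.0;
    /* The reduction clause gives each thread a private partial sum
     * and combines them safely when the loop finishes. */
    #pragma omp parallel for reduction(+:total)
    for (int i = 0; i < N_ITEMS; i++)
        total += data[i];

    printf("total = %f\n", total);
    return 0;
}
```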

Below is a roundup of the most interesting quotes from the teleconference.

“The shift in hardware technology from a single core to multicore will have a profound effect in the way we do programming in the future. We’re really in the midst of a revolution in the computing industry.” –Tony Hey, Microsoft Research

“Parallelism is coming not just to high-end systems, but across a very broad slice of computer science, and we expect [it] to permeate every corner of computer science going forward.” –Andrew Chien, director of Intel Research

“The technology revolution that Andrew [Chien] outlined means that every computer will be a parallel computer. Every problem must be a parallel problem … We want to democratize parallelism.” –Marc Snir, codirector of the UPCRC and a professor of computer science at the University of Illinois

“My perspective here is, there’s not that much that universities haven’t invented [in parallel computing], but there was no strong pull from industry because parallelism wasn’t pervasive; it was a small subset of the industry … Now that the pipeline is unclogged, now that there’s strong interest from Microsoft and Intel … work that’s been going on for many years will become much more fruitful.” –Marc Snir
