A View from John Pavlus

How Your Retina Screen is Helping Make Supercomputers More Powerful than Ever

Consumers’ relentless demand for better user experiences helps create the technology that drives high-performance computing forward.

November 15, 2012

Earlier this week, the “Titan” supercomputer at Oak Ridge National Laboratory in Tennessee was named the fastest supercomputer on Earth. It performed nearly 18 quadrillion floating-point calculations per second on the LINPACK benchmark (the high-performance computing industry’s standard “speedometer”) by accelerating its CPUs with graphics processing units from NVIDIA, for a combined 560,640 processor cores. Intriguingly, this same “Kepler” GPU architecture also provides the graphics horsepower for the Retina screen on the new MacBook Pro. This isn’t a coincidence. Without ordinary consumers’ relentless desire for next-generation user experiences – sharper screens for their laptops, better graphics for their games, longer-lasting batteries for their mobile devices – scientific supercomputing would be kind of screwed.
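
For a sense of what that “speedometer” actually measures: LINPACK times the solution of a dense system of linear equations and divides a nominal operation count by the elapsed time. The Python sketch below is only a rough, laptop-scale illustration of that idea; the matrix size n is an arbitrary choice, and NumPy’s solver stands in for the real HPL code that produced Titan’s score.

    # A LINPACK-style flops estimate: time a dense linear solve, then divide
    # a nominal operation count by the elapsed wall-clock time.
    # Illustrative only; this is not the official HPL benchmark.
    import time
    import numpy as np

    n = 4000                              # matrix size (arbitrary; real HPL runs are far larger)
    rng = np.random.default_rng(0)
    A = rng.standard_normal((n, n))
    b = rng.standard_normal(n)

    start = time.perf_counter()
    x = np.linalg.solve(A, b)             # LU factorization plus triangular solves
    elapsed = time.perf_counter() - start

    flops = (2 / 3) * n**3 + 2 * n**2     # operation count HPL conventionally credits
    print(f"~{flops / elapsed / 1e9:.1f} gigaflops on this machine")

A supercomputer’s TOP500 score comes from essentially this measurement, heavily tuned and spread across hundreds of thousands of cores at once.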

A Cray-2 supercomputer from 1985. An iPad 2 has an equivalent amount of processing power.

“It takes about a billion dollars to develop a high-performance processor,” says Steve Scott, chief technology officer of NVIDIA’s Tesla business unit. “The supercomputing market isn’t big enough to support the development of this hardware. Fortunately, NVIDIA is supported by millions and millions of gamers that want ever-faster processing power for their gaming. We can take the same processors that are designed to run graphics for video games and use them to perform the calculations necessary to simulate the climate or design more fuel efficient engines.”

GPUs have been used to accelerate supercomputers for years. Before that, commercial off-the-shelf (COTS) chips revolutionized supercomputing by replacing the expensive custom CPUs that had made firms like Cray famous. In most areas of science, innovation trickles down from governments and academia to industry and consumer applications. But as high-performance computation rises to meet theory and experimentation as a “third pillar” of scientific discovery, the reverse is true. Legions of Skyrim-obsessed gamers and iPad-toting moms are the engine that drives technical innovation upward to scientists and researchers.

And what motivates all those gamers and moms to keep obsessing and desiring and upgrading and buying, year after year? User experience. Apple doesn’t develop Retina screens (and NVIDIA doesn’t create the GPUs to power them) “just because.” When Apple billed its Power Mac G4 as the first “desktop supercomputer” more than a decade ago (because it could perform more than a billion floating-point calculations per second), it wasn’t because the company wanted computational scientists to high-five it. It was because a gigaflop was enough computing power to ensure (at the time) a seamless, on-demand user experience for ordinary consumers.

The next big milestone for supercomputing is the so-called exascale, where computers can execute 10^18 calculations per second – a thousand times more powerful than today’s petascale machines like Titan, and enough (according to some researchers) to make magical feats of simulation possible, like screening potential drug designs against every known class of protein so that any possible side effect can be predicted in advance.

But the barrier to reaching the exascale isn’t processing power, it’s energy usage. And so once again, ordinary user experience may be what takes us to the next generation of supercomputing, as scientists and engineers experiment with using ultra-low-power mobile chip architectures such as ARM to wring ever more FLOPS per watt out of their machines. After all, the main reason the chip in your phone is much more energy-efficient than the one in your desktop is to ensure that your battery lasts more than 15 minutes. (And to make sure the phone doesn’t burn through your pants. That would be a pretty terrible user experience.)
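
A quick back-of-the-envelope calculation shows how hard that wall is. Taking Titan’s publicly reported numbers as a starting point (roughly 17.6 petaflops sustained on about 8.2 megawatts) and scaling the same efficiency up to an exaflop gives a power bill in the hundreds of megawatts. The figures below are approximations, used only to illustrate the arithmetic.

    # Back-of-the-envelope arithmetic behind the exascale power wall.
    # The Titan figures are approximate, publicly reported values,
    # used here purely for illustration.
    titan_flops = 17.6e15          # sustained LINPACK performance, flops
    titan_watts = 8.2e6            # reported power draw, watts

    flops_per_watt = titan_flops / titan_watts
    exa_watts = 1e18 / flops_per_watt       # power for one exaflop at Titan's efficiency

    print(f"Titan efficiency: ~{flops_per_watt / 1e9:.1f} gigaflops per watt")
    print(f"An exaflop at that efficiency: ~{exa_watts / 1e6:.0f} megawatts")

At roughly two gigaflops per watt, an exascale machine would draw a power plant’s worth of electricity, which is why the flops-per-watt discipline of phone and tablet chips, and not just their raw speed, is what interests supercomputer designers.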

Supercomputers themselves offer a ruthlessly primitive user experience: researchers often code their applications themselves in Fortran and interact with them through the command line. As they should – none of those petaflops should be “wasted” on user-interface frivolities that the rest of us take for granted. But it’s those very frivolities – and our voracious desire for faster, stronger, better, prettier ones every year – that make supercomputing possible.
