Moore’s Law Over, Supercomputing "In Triage," Says Expert
High-performance computing expert Thomas Sterling would like you to know that a computing goal you’ve never heard of will probably never be reached. The reason you should care is that it means the end of Moore’s Law, which says that roughly every 18 months the amount of computing you get for a buck doubles.

Or at least, the end of Moore’s Law-style advances in the processing power of the world’s biggest supercomputers. For a while now, every 11 years or so, the planet’s smartest and best-funded computer scientists have managed to produce a supercomputer that’s 1,000 times faster than its predecessor. In 1997, we reached terascale computing, or a trillion (10^12) floating point operations per second. In 2008, Los Alamos’ Roadrunner supercomputer reached petascale computing, or a quadrillion (10^15) floating point operations per second.
In a mind-blowingly jargon-rich interview with HPCwire, Sterling doesn’t just assert that zettascale (10^21 FLOPS) computing is impossible; he also makes it seem pretty unlikely we’re going to reach the next milestone, exascale computing (10^18 FLOPS), without ripping apart our existing ways of building supercomputers, root and branch. Emphasis mine:
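The milestone arithmetic above can be sketched in a few lines. This is just an illustration of the cadence the article describes (the milestone names and exponents come from the text; the 1997 and 2008 dates are the tera- and petascale years):

```python
import math

# Supercomputing milestones named in the article: exponent = log10 of FLOPS.
milestones = {"terascale": 12, "petascale": 15, "exascale": 18, "zettascale": 21}

# Each named milestone is 1,000x the one before it.
step = 10 ** (milestones["petascale"] - milestones["terascale"])
assert step == 1000

# If a 1,000x jump takes ~11 years (terascale 1997 -> petascale 2008),
# the implied performance-doubling time is:
years_per_1000x = 2008 - 1997
doubling_years = years_per_1000x / math.log2(1000)
print(f"doubling roughly every {doubling_years * 12:.0f} months")
```

Run it and the implied doubling time works out to roughly 13 months, which is why the 1,000×-per-decade cadence in supercomputing has tracked Moore’s Law so closely.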
“[I]ndustry will deliver the systems that will be used in the next decade. There is no other choice. It is clear that vendors would prefer not to have to retool and this is true for users as well. To do so will involve a degree of disruption that would be best avoided if it were possible. And for a portion of the overall workload, even at exascale, this may prove to be possible. But such systems are a placebo to an ailing HPC community that if not in triage, is already showing symptoms of underlying conditions that require attention.”
Read the whole interview and the upshot is clear: Sterling argues that we’re not going to reach the next supercomputing milestone through more incremental improvements to existing systems, which is how we reached the last two. Indeed, he says that without “innovative ways of managing vertical and lateral data movement,” current estimates suggest that future exascale machines would use roughly ten times more power than is considered feasible.
If Sterling is right, and he is one of the deans of high-performance computing, it seems likely that we’ll never reach the next supercomputing milestone in silicon. Physicist Michio Kaku says Moore’s Law will collapse in about ten years. Sterling agrees, and says it comes down to the basic physical restrictions of working with atoms.
What will the supercomputers of the future be made from, then?
Kaku mentions machines based on protein, DNA, and optical devices as possible replacements. When the time comes to transition to a new medium, he thinks the world will migrate to three-dimensional chips. That technology would be followed by molecular computers and, eventually, by quantum computers around the end of the 21st century.