Moore’s Law Is Dead. Now What?

Shrinking transistors have powered 50 years of advances in computing—but now other ways must be found to make computers more capable.

Mobile apps, video games, spreadsheets, and accurate weather forecasts: that’s just a sampling of the life-changing things made possible by the reliable, exponential growth in the power of computer chips over the past five decades.

But in a few years technology companies may have to work harder to bring us advanced new use cases for computers. The continual cramming of more silicon transistors onto chips, known as Moore’s Law, has been the feedstock of exuberant innovation in computing. Now it looks to be slowing to a halt.

“We have to ask, is this going to be a problem for areas like mobile devices, data centers, and self-driving cars?” says Thomas Wenisch, an assistant professor at the University of Michigan. “I think yes, but on different timescales.”

Moore’s Law is named after Intel cofounder Gordon Moore. He observed in 1965 that transistors were shrinking so fast that the number that could fit on a chip doubled every year; in 1975 he revised the pace to a doubling every two years.
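To get a rough sense of how that cadence compounds, here is a small illustrative calculation (the starting figure is hypothetical, not from the article): doubling every two years turns a million transistors into roughly a billion over two decades.

```python
# Illustrative sketch of Moore's Law as a compounding rule:
# transistor counts double roughly every two years (the post-1975 cadence).
def projected_transistors(initial_count: int, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count forward, assuming one doubling per `doubling_period` years."""
    return initial_count * 2 ** (years / doubling_period)

# Hypothetical example: a chip with 1 million transistors, projected 20 years ahead.
print(f"{projected_transistors(1_000_000, 20):,.0f}")  # ~1,024,000,000, a thousandfold gain
```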

The chip industry has kept Moore’s prediction alive, with Intel leading the charge. And computing companies have found plenty to do with the continual supply of extra transistors. But Intel pushed back its next transistor technology, with features as small as 10 nanometers, from 2016 to late 2017. The company has also decided to increase the time between future generations (see “Intel Puts the Brakes on Moore’s Law”). And a technology roadmap for Moore’s Law maintained by an industry group that includes the world’s largest chip makers is being scrapped. Intel has suggested that silicon transistors can keep shrinking for only another five years.

The computers in our pockets will probably feel the effects later than other types of computing devices, Wenisch guesses. Mobile devices are powered by chips made by companies other than Intel, and those chips have generally lagged slightly behind in transistor technology. And mobile processors don’t make full use of some design techniques already well established in more powerful processors built for stationary machines, he says.

“You probably have a generation or two more runway in mobile,” says Wenisch.

However, many useful things that mobile devices can do rest on the power of billion-dollar data centers, where the end of Moore’s Law would be a more immediate headache. Companies such as Google and Microsoft eagerly gobble up every new generation of the most advanced chips, packed more densely with transistors.

Wenisch says companies such as Intel, which dominates the server chip market, and their largest customers will have to get creative. Alternative ways to get more computing power include working harder to improve the design of chips and making chips specialized to accelerate particular crucial algorithms.

For example, strong demand seems inevitable for silicon tuned to the linear algebra at the heart of deep learning, a powerful machine-learning technique. Graphics chip company Nvidia and several startups are already moving in that direction (see “A $2 Billion Chip to Accelerate Artificial Intelligence”).
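As a minimal sketch of what that algebra looks like (illustrative, not drawn from the article), the workhorse of a deep-learning model is the dense matrix multiplication inside each layer, which is exactly the pattern such accelerators target.

```python
import numpy as np

# Minimal, illustrative sketch: one dense neural-network layer is dominated by
# a single matrix multiplication -- the operation deep-learning accelerators target.
batch, n_in, n_out = 64, 1024, 1024                      # hypothetical sizes
x = np.random.randn(batch, n_in).astype(np.float32)      # input activations
w = np.random.randn(n_in, n_out).astype(np.float32)      # layer weights

y = np.maximum(x @ w, 0.0)                               # matrix multiply + ReLU

# Roughly 2 * batch * n_in * n_out multiply-adds per layer; repeating this pattern
# billions of times is what makes specialized silicon worthwhile.
print(y.shape, 2 * batch * n_in * n_out)
```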

Microsoft and Intel are also working on the idea of running some code on reconfigurable chips called FPGAs for greater efficiency (see “Microsoft Says Reprogrammable Chips Will Make AI Smarter”). Intel spent nearly $17 billion to acquire leading FPGA manufacturer Altera last year and is adapting its technology to data centers.

Horst Simon, deputy director of Lawrence Berkeley National Laboratory, says the world’s most powerful calculating machines appear to be already feeling the effects of Moore’s Law’s end times. The world’s top supercomputers aren’t getting better at the rate they used to.

“For the last three years we’ve seen a kind of stagnation,” says Simon. That’s bad news for research programs reliant on supercomputers, such as efforts to understand climate change, develop new materials for batteries and superconductors, and improve drug design.

Simon says the coming plateau in transistor density will stir more interest among supercomputer and data-center designers in redrawing the basic architecture of computers. Getting rid of certain design features dating from the 1940s could unlock huge efficiency gains (see “Machine Dreams”). Yet taking advantage of those gains would mean rethinking the design of many types of software, and would require programmers to change their habits.

Whatever kind of computer you’re interested in, the key question is whether the creative avenues left open to computing companies can provide payoffs similar to those of Moore’s Law after it ends, says Neil Thompson, an assistant professor at MIT Sloan School. “We know that those other things matter, but the question is, are they of the same scale?” he says.

One reason to think they might not be is that companies will have to work together in new and complicated ways, without the common heartbeat that used to keep the industry’s product and R&D plans in sync.

“One of the biggest benefits of Moore’s Law is as a coordination device,” says Thompson. “I know that in two years we can count on this amount of power and that I can develop this functionality—and if you’re Intel you know that people are developing for that and that there’s going to be a market for a new chip.”

Without that common music to dance to, advances in computing power that benefit all kinds of companies, not just ones with mutually strong incentives to collaborate, could be less common.
