Chip Saves Power by Fudging the Figures

Chips that save energy by approximating some calculations could allow mobile devices to be smarter at understanding the world.
December 19, 2013

We owe our smartphones and supercomputers to the mathematicians and engineers who figured out in the 1940s and 1950s how to create machines that can crunch numbers at high speed with perfect accuracy. Some researchers are now turning away from that principle, working on designs that deliberately sacrifice accuracy for power efficiency. The approach, known as approximate computing, could extend the battery life of mobile devices and enable advanced techniques such as computer vision.

Researchers from Purdue University reported last week on tests of a simple processor that uses approximation. The researchers were able to cut the chip’s energy consumption in half by allowing error to creep into some operations as it ran a range of software for tasks such as recognizing handwriting or detecting eyes in images.

Other researchers, from the University of Washington, have shown that the energy consumption of flash memory, used in mobile computers such as phones, could be cut if chips were allowed to store non-critical data imperfectly. Both groups presented their work at the Micro conference at the University of California, Davis.

Approximate computing has been researched for years but has now advanced to the point where it is possible to build real systems using the technique, says Anand Raghunathan, a professor at Purdue University who in 2006 was named by MIT Technology Review as one of the 35 Innovators under 35 (see “Making Mobile Secure”). “We have proof in working silicon that this can actually be done,” he says.

The timing is good, because although complete accuracy will always be needed for a lot of jobs, such as calculating paychecks, many of the advanced tasks being asked of computers, such as recognizing images or reproducing sound, can tolerate some sloppiness.

“For more and more computers, whether in phones or data centers, the end result is not a precise numerical value, it’s something meant for humans,” says Raghunathan. “The calculations involved in these apps don’t need to be treated as all sacred or precise—we can exploit that forgiving nature.” When a computer tries to recommend a movie or recognize your friend in a photo, for example, approximating some of the numbers used along the way is fine as long as the final answer is correct.

Allowing computers to approximate can save energy in a variety of ways, mostly by removing quality controls on the manipulation of electronic signals. Purdue’s processor design, dubbed Quora, saves energy by scaling back the precision used to express certain values it operates on, which allows some of its circuit elements to remain idle. It also dials down the voltage to some circuit elements when they work on approximated data. Crucially, the design doesn’t do that for every instruction a piece of software directs it to carry out. Instead, it looks for signals written into a program’s code indicating which parts of it are tolerant to some error and by how much.
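The article doesn’t spell out what those in-code signals look like, but the underlying trade can be sketched in a few lines of Python. Everything below is illustrative rather than Purdue’s actual interface: the “tolerant” flag stands in for a compiler annotation, and “quantize” mimics hardware truncating values so that low-order circuit lanes can sit idle.

    # A minimal sketch of precision scaling, the mechanism behind
    # approximation in designs like Quora. All names (quantize, dot,
    # tolerant, bits) are illustrative, not Purdue's interface.
    def quantize(x, bits):
        # Keep only `bits` fractional bits of x; in hardware, the
        # dropped low-order bit lanes could idle to save power.
        scale = 2 ** bits
        return round(x * scale) / scale

    def dot(a, b, tolerant=False, bits=4):
        # Dot product, a workhorse of recognition and vision code.
        # When `tolerant` is set, operands are truncated first.
        total = 0.0
        for x, y in zip(a, b):
            if tolerant:
                x, y = quantize(x, bits), quantize(y, bits)
            total += x * y
        return total

    # Exact:       dot([0.513, 0.274], [0.891, 0.722])                -> 0.6549...
    # Approximate: dot([0.513, 0.274], [0.891, 0.722], tolerant=True) -> 0.625

A result that lands within a few percent of the exact answer rarely changes which face or word the surrounding application ultimately picks, which is what makes it safe to mark such code as tolerant in the first place.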

Being able to specify the degree of noise acceptable for different parts of a program makes it possible to use approximation without overloading it with errors, says Swagath Venkataramani, the Purdue researcher who led work on the processor. He predicts that descendants of Quora will appear in commercial products as co-processors to conventional processors; such co-processors could take on tasks such as image processing that benefit from approximation. “As we have demonstrated, this includes recognition, data mining, search, and vision—applications that are growing extremely popular across the computing spectrum.”

Luis Ceze, an associate professor at the University of Washington, says the Purdue work shows that chips that approximate can be practical. However, it may be better, Ceze says, to have chip hardware play a less active role in determining where to apply approximation and to rely on software instead. That could make it easier to automatically translate software written for conventional computers into a form that an approximate system could handle, he says. Still, Ceze acknowledges that the field is far from settling on a single way of doing things. “This area is very much in an exploration phase,” he says.

Ceze has little doubt that the ability to approximate will make it into commercial computing devices. His own group has begun talking with flash memory companies about a technique it developed that saves energy by cramming more bits than usual into each memory cell, only marginally degrading the stored data.
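The coverage doesn’t detail that encoding, but the trade-off is easy to simulate. In this hedged sketch, the constants APPROX_BITS and FLIP_PROB are assumptions standing in for whatever error rate denser flash cells actually exhibit; nothing here is the Washington group’s published design.

    import random

    # Rough sketch of approximate storage: packing extra bits into a
    # flash cell makes the low-order bits of non-critical data (such
    # as pixel values) unreliable. APPROX_BITS and FLIP_PROB are
    # assumptions for illustration, not measured figures.
    APPROX_BITS = 2      # low-order bits allowed to be unreliable
    FLIP_PROB = 0.1      # chance each unreliable bit flips on read

    def read_back(pixel):
        # Simulate reading an 8-bit pixel from dense, error-prone cells.
        for bit in range(APPROX_BITS):
            if random.random() < FLIP_PROB:
                pixel ^= 1 << bit    # flip an expendable low-order bit
        return pixel

A stored pixel value of 200 might read back as 201 or 202, a difference that is invisible in a photograph, while the denser packing cuts the energy spent per stored bit.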

Consumers are playing a major part in driving the industry’s openness to such ideas, says Ceze. “We have a lot of data these days, and a lot of it is approximable in nature: things like images, sound, video data from sensors.”
