It takes energy to run the computers inside data centers—and then more energy to cool them down. With demand for cloud computing growing rapidly, the companies that run these centers are looking for ways to save on energy costs. The microprocessors inside their computers look to be an ideal target.
For years, Intel and AMD have dominated the microprocessor market with high-performance chips. But as the cost of cooling chips becomes a bigger issue, these companies will face competition from low-power upstarts, some of which use chip architectures originally developed for cell phones and other mobile devices.
The ARM chip design, licensed by the Cambridge, U.K.-based company ARM, was originally developed for battery-powered devices, which makes it inherently low-power. The design is relatively simple and trades processing power for energy savings. Unlike Intel and AMD chips, ARM designs can be modified by licensees and optimized for specific tasks. ARM chips also integrate components that are normally spread across a server onto a single chip, a system-on-chip approach that saves space and cost.
ARM chips are also already produced in greater numbers than Intel and AMD chips. In the long term, this could mean greater innovation and lower production costs, because competition among different producers may drive better designs and production methods.
Because each ARM chip delivers less performance than a high-end processor, a company would need to deploy more of them to handle demanding applications.
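The trade-off can be sketched with a back-of-envelope calculation. The figures below are purely hypothetical assumptions for illustration, not measurements of any real chip:

```python
import math

# HYPOTHETICAL numbers, assumed for illustration only: suppose an ARM server
# chip delivers 40% of the throughput of a high-end x86 chip while drawing
# 15% of its power.
X86_THROUGHPUT = 1.0   # normalized units of work per second (assumed)
X86_POWER_W = 100.0    # watts (assumed)
ARM_THROUGHPUT = 0.4   # relative to the x86 chip (assumed)
ARM_POWER_W = 15.0     # watts (assumed)

# Chips needed to match one x86 chip's throughput (round up: whole chips only).
chips_needed = math.ceil(X86_THROUGHPUT / ARM_THROUGHPUT)

total_arm_power = chips_needed * ARM_POWER_W
savings = X86_POWER_W - total_arm_power

print(f"ARM chips needed: {chips_needed}")        # 3
print(f"Total ARM power:  {total_arm_power} W")   # 45.0 W vs. 100.0 W
print(f"Power saved:      {savings} W")           # 55.0 W
```

Under these assumed numbers the ARM cluster still comes out ahead, but as Halfhill notes below, the margin depends entirely on how much performance per watt the chips actually deliver.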
“Producers of these ARM chips don’t have any secret sauce that gets them around the laws of physics,” says Tom Halfhill, industry analyst and editor of Microprocessor Report. “They’re not talking a whole lot right now about how much power their chips are really going to save, but the basic fact is that performance costs power.”
Viren Shah, senior director of Marvell’s enterprise business unit, says that the chips are best used in systems where networking is the processing bottleneck. Good examples of this would be Web servers and cloud-computing applications, where simple processing tasks can be distributed across a network.
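The workload Shah describes can be sketched as a pool of workers each handling small, independent requests. This is a minimal illustration in Python, not Marvell's software: the request handler and worker count are hypothetical, and threads in one process stand in for separate low-power server nodes behind a load balancer:

```python
from concurrent.futures import ThreadPoolExecutor

# HYPOTHETICAL request handler: each task is small and independent, so any
# node can serve it without coordinating with the others.
def handle_request(request_id: int) -> str:
    return f"served request {request_id}"

# Simulate a cluster of low-power nodes as a worker pool; in a real deployment
# a load balancer would spread requests across many ARM-based servers instead.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(handle_request, range(8)))

print(results[0])  # served request 0
```

Because the tasks share no state, throughput scales by adding more cheap nodes rather than by making any single processor faster, which is exactly where a lower-performance, lower-power chip fits.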