Computing

What's Inside the iPad's Chip?

Cost and power efficiency may have pushed Apple to create its own microchip.

Despite widespread speculation, nothing beyond what Steve Jobs announced last week is known about the A4 chip at the heart of the Apple iPad.

Chip in: The iPad’s A4 is Apple’s first homemade chip.

Jobs described the chip with typical restraint during the unveiling of the iPad. “It’s powered by our own silicon–the one gigahertz Apple A4 chip–it screams,” he said, adding that the A4 chip includes an integrated CPU and graphics core on a single system on a chip (SoC).

Soon after the announcement, experts began speculating that the chip was based on the same ARM architecture as the iPhone and iPod touch.

“No official source that I can find has confirmed that the A4 uses ARM,” says Tom Halfhill, senior analyst at Microprocessor Report. However, he says, it’s logical to assume that the iPad is using a processor based on the ARM architecture. “It makes sense, [because] Apple wouldn’t have to port the iPhone OS to a new CPU architecture.”

Some have suggested that the chip may be based on the latest and fastest ARM designs, but both the slightly older and slower ARM Cortex-A8 and the newer ARM Cortex-A9 cores can run at a clock speed of one gigahertz, notes Halfhill. Boosting the speed of a Cortex-A8 core–the core thought to be inside the Samsung-built chip that powers the iPhone 3GS–to one gigahertz would be possible because the iPad has more room for batteries, allowing engineers to drive the A4 at a higher voltage and therefore a higher clock frequency.
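The trade-off behind that reasoning can be illustrated with the textbook dynamic-power approximation for CMOS chips, P ≈ C·V²·f. The numbers below are purely hypothetical placeholders, not real A4 or iPhone specifications:

```python
# Rough illustration of why a bigger battery budget permits a higher clock:
# dynamic power scales roughly as capacitance * voltage^2 * frequency, so
# raising both voltage and frequency costs power superlinearly.
def dynamic_power(cap_farads, volts, hertz):
    """Approximate CMOS dynamic power: P = C * V^2 * f."""
    return cap_farads * volts**2 * hertz

# Hypothetical numbers for illustration only (not real chip specs).
phone = dynamic_power(1e-9, 1.1, 600e6)    # ~600 MHz phone-class core
tablet = dynamic_power(1e-9, 1.3, 1000e6)  # ~1 GHz core at higher voltage

print(f"phone-class core:  {phone * 1e3:.0f} mW")
print(f"tablet-class core: {tablet * 1e3:.0f} mW")
print(f"power ratio: {tablet / phone:.2f}x")
```

Under these made-up figures, the one-gigahertz part draws more than twice the power of the slower one–the kind of budget a tablet-sized battery can absorb but a phone cannot.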

Gene Munster, a senior research analyst at Piper Jaffray, says that Apple might have felt the need to develop its own chip for a simple reason. “One reason Apple did this is because they’re saving money on the chip,” says Munster. “On an iPhone, a Samsung chip is $15–it’s the third most expensive piece of the phone. Going from $15 to $5 doesn’t sound like much, but if you multiply it over 15 million devices, it adds up.”
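Munster's back-of-the-envelope math checks out–his figures, with the arithmetic made explicit:

```python
# Munster's figures: a $15 Samsung chip versus a hypothetical $5 in-house
# part, multiplied across his 15-million-device volume estimate.
chip_cost_samsung = 15.0   # dollars per device
chip_cost_inhouse = 5.0    # dollars per device
devices = 15_000_000

savings = (chip_cost_samsung - chip_cost_inhouse) * devices
print(f"total savings: ${savings:,.0f}")  # → total savings: $150,000,000
```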

Raw speed has been cited as another reason for Apple to move to a new chip, but Munster doesn’t buy it–not with companies like Nvidia and Qualcomm offering similarly powerful designs for netbooks and other devices. “I just can’t imagine Apple being able to build something themselves that’s better than these companies,” he says.

A more likely technical reason for Apple’s custom silicon, Munster argues, is the need to keep power consumption to a minimum. “They could create something that’s not as fast, but might be better at power consumption,” he says. “If you look at the battery life they’re talking about, the tablet is bigger than the iPhone but it seems like they’ve done a better job with battery life.”

The A4’s graphics core might also use the ARM architecture, but this would require on-the-fly translation of code for existing iPhone applications. Since “almost all” existing iPhone applications will run on the iPad, it’s more likely that Apple is continuing to use upgraded versions of the same graphics cores present in the iPhone and iPhone 3GS, which were created from designs licensed from Imagination Technologies, based in the U.K.

Representatives of Imagination refused to discuss whether or not the A4 SoC uses an Imagination core. But Apple owns just under 10 percent of the company, and all iPhone and iPod touch models use Imagination’s PowerVR MBX family of graphics cores. Imagination also recently confirmed that the iPhone 3GS uses the upgraded PowerVR SGX design. If the iPad continues this trend, it could take advantage of features of the Imagination graphics core that are uniquely well-suited to driving a screen as large as the one on the iPad.

For instance, Imagination uses so-called “tile-based deferred rendering,” which helps drive a faster user interface. “You split a screen into little tile zones,” says Kristof Beets, manager of business development for graphics at Imagination. This allows a chip’s graphics cores to compute individual tiles of the screen–say, 32 by 32 pixels on an 800 by 480 screen, with data stored in on-chip caches. By avoiding the step where a full-screen renderer has to access RAM, the chip can render a screen full of images much faster.
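The tiling scheme Beets describes can be sketched in a few lines. This is a toy illustration of carving a framebuffer into 32-by-32 tiles, not Imagination's actual implementation:

```python
# Toy sketch of tile-based rendering: split an 800x480 framebuffer into
# 32x32-pixel tiles and process each tile independently, so a tile's pixel
# data can stay in fast on-chip memory instead of external RAM.
import math

SCREEN_W, SCREEN_H = 800, 480
TILE = 32

def tiles(width, height, tile):
    """Yield the (x, y, w, h) rectangle of each tile, clipped to the screen."""
    for ty in range(math.ceil(height / tile)):
        for tx in range(math.ceil(width / tile)):
            x, y = tx * tile, ty * tile
            yield x, y, min(tile, width - x), min(tile, height - y)

def render_tile(rect):
    # A real GPU would rasterize only the geometry touching this tile,
    # keeping the working set small enough for on-chip caches.
    pass

tile_list = list(tiles(SCREEN_W, SCREEN_H, TILE))
for rect in tile_list:
    render_tile(rect)

print(len(tile_list))  # 25 x 15 = 375 tiles
```

Because each tile fits in on-chip memory, the per-pixel read-modify-write traffic to external RAM largely disappears, which is where the speed and power savings come from.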

A second feature of Imagination’s technology that may be relevant is “deferred rendering.” Normally, a 3-D renderer computes an object’s shape and the lighting effects applied to it before determining whether the object is actually visible on screen. This means that where pixels correspond to objects blocked by other objects, some of that computation is wasted. The same is true for objects in windows layered one on top of the other in a desktop environment. Imagination’s chips, in contrast, determine which objects are visible at each pixel first, minimizing the number of computations that must be made and allowing for lower power consumption.
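The saving from deferring the expensive work can be shown with a toy visibility pass: pick the nearest object at each pixel first, then shade only the winners. This is a drastic simplification of Imagination's hidden-surface removal, not its real pipeline:

```python
# Toy deferred shading: resolve per-pixel visibility first, then run the
# expensive shading step once per pixel instead of once per overlapping object.

# Hypothetical scene: three stacked objects each covering the same 4 pixels.
objects = [
    {"name": "background", "depth": 10.0},
    {"name": "window",     "depth": 5.0},
    {"name": "cursor",     "depth": 1.0},
]
pixels = range(4)

# Immediate-mode style: shade every object at every pixel it covers.
immediate_shades = len(objects) * len(pixels)  # 12 shading operations

# Deferred style: a cheap depth comparison picks the nearest object per pixel...
visible = {p: min(objects, key=lambda o: o["depth"]) for p in pixels}
# ...then shading runs only once for the surviving object at each pixel.
deferred_shades = len(visible)  # 4 shading operations

print(immediate_shades, deferred_shades)
```

In this contrived scene the deferred approach does a third of the shading work; the deeper the overdraw, the bigger the saving.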

In April 2008, Apple acquired P.A. Semi, a fabless chip designer that specialized in power-efficient processors using the PowerPC architecture–the same architecture used by Apple in its computers until it switched over to Intel CPUs in 2006.

“Some of [P.A. Semi’s] engineers had ARM experience, and, of course, their chip-design knowledge would be transferable to any CPU architecture,” says Halfhill. “A highly integrated SoC like the Apple A4 would take at least 12 to 18 months to design, debug, and manufacture, however, making it unlikely that P.A. Semi engineers designed it from scratch.”

In Halfhill’s view, this makes it even more likely that the A4 chip is made primarily of designs that closely match existing ARM cores. “Apple would have had to move awfully fast to design its own ARM-compatible core and the A4 SoC in so short a time,” he says. “That’s why I think the A4 is built on existing cores from ARM.”

Halfhill suggests that P.A. Semi engineers may have been brought on board for some project other than the A4 chip. “I wouldn’t be surprised if many or most of the P.A. Semi engineers were assigned to another project–such as a future Apple A5 chip,” he says.
