How Games Are Driving a Mobile Graphics Revolution
Since Apple opened its App Store in 2008, catering to the needs of gamers has become increasingly important for mobile-device makers. While the iPhone was not designed primarily as a gaming device, games soon dominated its best-selling app charts, a pattern that was duplicated on Android devices and looks set to repeat on Windows phones. Qualcomm, a major manufacturer of chipsets for mobile devices, estimates that 60 percent of smart-phone users regularly play games on their devices.

Consequently, chip makers have been competing to provide mobile-device manufacturers with better and better graphics capabilities by means of dedicated processors that are now among the devices’ most complex and powerful subsystems. Painting hundreds of thousands of pixels at a time, these graphics processors don’t just display two-dimensional icons, pictures, and video but can render the complex 3-D environments of many modern video games—calculating, for example, how a sunbeam will reflect off a tattered flag as it flutters in a breeze.
Qualcomm spent $65 million in 2009 to buy the handset graphics operations of Advanced Micro Devices, which were originally part of ATI Technologies, an early leader in graphics processors for personal computers. ARM, which designs the general-purpose processor cores that power most of the world’s smart phones, has been placing increasing emphasis on its Mali family of graphics processors, the first versions of which were announced in 2007. The overarching importance of graphics has even allowed Nvidia, which created the first commercial graphics processing unit in 1999 for the personal-computer games market, to enter the mobile market with an eight-core graphics processor and a dual-core general-purpose processor bundled on the same chip. “The catalyst was Apple’s iPhone,” says Matt Wuebbling, director of product marketing for Nvidia. “It showcased a mobile device that is purely display based.” That is, it relies on a graphical interface for all interactions with the user.
One of the biggest challenges companies like Qualcomm and Nvidia faced as they developed graphics processors for mobile devices was to provide advanced processing without draining a battery within 20 minutes. Whereas graphics cards for personal computers often require beefed-up power supplies and cooling fans, Qualcomm calculates that its latest chips render scenes nearly as complex as those found in desktop computer games while using less than 1 percent as much energy as a desktop graphics processor.
Because the mobile market is so huge, chip makers can make investments that they couldn’t afford for smaller targets. Market researcher iSuppli says that last year 295 million smart-phone handsets were shipped, compared with 27.2 million dedicated handheld gaming devices such as the Nintendo DS. The result has been a positive feedback loop: better hardware leads to more advanced games, which in turn stoke the demand for better hardware. Progress has been so rapid that mobile devices are moving toward the kind of graphics performance normally associated with video-game consoles and high-end PC systems. For example, last spring Sony Ericsson introduced the Xperia Play phone, which is capable of playing games designed for the original PlayStation console, and Apple says the iPhone 4S contains a graphics processor from Imagination Technologies that is seven times faster than that of the iPhone 4.
Indeed, as mobile graphics power increases, growing numbers of users are likely to play some of their favorite mobile games on their big-screen TVs by hooking up their handsets through a cable. In terms of the graphics capabilities available, says Wuebbling, “on the mobile side, for gaming, we’re where PCs were 10 years ago.”