
In March, the organizers of a computer-security conference called CanSecWest challenged attendees to break into any one of five smart phones, among them Apple's popular iPhone. The perceived difficulty of the task, especially of breaking into the iPhone, meant that few researchers made any attempt to hack the devices, and none succeeded.

Now two researchers hope to make things considerably easier for would-be iPhone hackers. Next month, Charles Miller, a principal analyst at Independent Security Evaluators, and Vincenzo Iozzo, a student at the University of Milan, in Italy, will present a way to run nonapproved code on Apple’s mobile device at the Black Hat Security Conference, in Las Vegas.

Researchers have previously found vulnerabilities in the security of the iPhone; Apple disclosed and issued a patch for a dozen such security holes in the device last November. But even after such a flaw has been exploited, it remains tricky to run a nonapproved program, and that difficulty leads many security researchers simply not to spend much time hunting for flaws in the first place.

“If you want to attack iPhones, you have to be able to run code to do whatever it is you want to do,” Miller says. “Maybe that is grabbing credentials, maybe it is listening into phone calls, maybe it is turning on the microphone. Who knows? But this all requires that you be able to run code.”

“Charlie found those particular places where changing permissions is allowed on the factory iPhones,” says Sergio Alvarez, a security consultant with Recurity Labs and a fellow iPhone hacker, who is familiar with Miller and Iozzo’s research. “[These parts of the phone] make our lives easier and give us more freedom to code generic and reliable second-stage [attacks].”

The challenge for security researchers and malicious attackers is that Apple restricts the data that can be executed in the iPhone’s memory and requires that programs for the iPhone be cryptographically signed by Apple. Code signing has security benefits, but it is also a way to control which applications run on the iPhone platform.
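
As a rough illustration of what that restriction means in practice, the C sketch below asks the kernel for a page of memory that is simultaneously writable and executable. It uses generic POSIX calls as a stand-in, not Apple's actual enforcement mechanism; on a stock iPhone, a strict no-executable-data and code-signing policy is supposed to refuse such a request, while a typical desktop system may allow it.

    /* Sketch: probe whether the OS allows a writable+executable mapping.
     * Generic POSIX example for illustration only, not Apple's code. */
    #include <stdio.h>
    #include <sys/mman.h>

    int main(void) {
        /* Ask for one page that is readable, writable, and executable at once. */
        void *p = mmap(NULL, 4096, PROT_READ | PROT_WRITE | PROT_EXEC,
                       MAP_PRIVATE | MAP_ANON, -1, 0);
        if (p == MAP_FAILED) {
            /* Expected outcome under a strict W^X / code-signing policy. */
            perror("mmap(RWX) refused");
            return 1;
        }
        printf("RWX mapping allowed at %p\n", p);
        munmap(p, 4096);
        return 0;
    }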

“In iPhone 1.0, there was very little security built into it,” Miller says. “But when they went to iPhone 2.0, less because they cared about people breaking into phones and more because they wanted to make sure that they wanted to have the App Store and not have people download all sorts of crazy apps, they added a bunch of security.”

But Miller found more than one instance in which Apple failed to prevent unauthorized data from executing. This means that a program can be loaded into memory as a nonexecutable block of data, after which the attacker can essentially flip a programmatic switch and make the data executable.
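
The general pattern being described can be sketched in ordinary C: stage some bytes as plain read/write data, then ask the kernel to remap the page as executable. Everything specific in the sketch below is an assumption made for illustration, including the trivial x86-64 payload and the use of POSIX mprotect; the particular iPhone memory regions where Apple's checks fail are not reproduced here.

    /* Sketch of the data-then-flip pattern: load bytes as non-executable
     * data, then request execute permission on the page. Illustration only;
     * this is not Miller's iPhone exploit. */
    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    int main(void) {
        /* Hypothetical payload: mov eax, 42 ; ret (x86-64 machine code). */
        unsigned char payload[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };

        /* Step 1: the bytes arrive as plain, non-executable data. */
        void *page = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                          MAP_PRIVATE | MAP_ANON, -1, 0);
        if (page == MAP_FAILED) { perror("mmap"); return 1; }
        memcpy(page, payload, sizeof payload);

        /* Step 2: "flip the switch" by asking for execute permission.
         * A strict code-signing/W^X policy denies this; where it is
         * allowed, the staged data can now run as code. */
        if (mprotect(page, 4096, PROT_READ | PROT_EXEC) != 0) {
            perror("mprotect to R-X refused");
            return 1;
        }

        int (*fn)(void) = (int (*)(void))page;
        printf("payload returned %d\n", fn());
        munmap(page, 4096);
        return 0;
    }

On a locked-down device, it is the permission change in the second step that is supposed to be rejected; the significance of Miller's finding, as Alvarez notes above, is that there are places on factory iPhones where changing permissions is still allowed.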

Credit: Technology Review

Tagged: Computing, Communications, Apple, security, iPhone, Black Hat security conference, hacks
