Akamai’s New CEO Aims to Speed Up Mobile Computing
Co-founder—and now CEO—Tom Leighton plans data-prioritization trials with Ericsson and massive use of distributed devices for transmitting video.
As Tom Leighton prepares to take over as CEO at Akamai Technologies—the company he co-founded 14 years ago to optimize how Web traffic is delivered—he’s got mobile data on his mind and a suite of new technologies in the wings.
He says his company, which has 119,000 servers and delivers between 15 and 30 percent of the Web’s traffic, is working on technologies that will lead to services that prioritize different kinds of mobile Web traffic, use distributed devices for storing or “caching” video content, and compress photos selectively depending on what device you are using.
Leighton, who is currently the company’s chief scientific officer, takes over as CEO on January 1 from Paul Sagan (see “Q&A: Paul Sagan”), who will remain as a senior advisor.
Leighton sat down with David Talbot, chief correspondent of MIT Technology Review, to explain how Akamai plans to get content to mobile devices faster.
How well do mobile networks deliver Internet content?
The performance of mobile devices, especially on cellular networks, can be very poor. If you look at download speeds from top commercial sites, they’re equivalent to landline speeds nine years ago—that’s like the Dark Ages of the Internet. Users expect it to be like TV: bang, it’s on. Change the channel and it just changes. They think the Web is supposed to work that way.
You launched a new site performance service in October, Aqua Ion, that you say will speed this up.
If a site is picture-rich, we can reduce the bandwidth needed to serve it. With Aqua Ion, when you go to a website, you actually go to an Akamai server on the network. We detect what device you are using, and we know how well it’s connected. We compress the picture or other object depending on what kind of device and connection you have. That relieves the network and provides a better experience for the user.
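The idea of picking a compression level from the detected device and connection can be sketched roughly as follows. The device classes, bandwidth thresholds, and quality values here are illustrative assumptions, not Akamai's actual parameters:

```python
# Hypothetical sketch of per-device image compression. A server would detect
# the client's device and connection, then serve an image compressed to a
# quality that link can comfortably carry. All numbers are assumptions.

def choose_jpeg_quality(device: str, bandwidth_mbps: float) -> int:
    """Pick a JPEG quality level from the detected device and connection."""
    # Baseline quality by device class: small screens tolerate more compression.
    base = {"phone": 60, "tablet": 70, "desktop": 85}.get(device, 75)
    # Degrade quality further on slow connections to save bandwidth.
    if bandwidth_mbps < 1.0:
        base -= 20
    elif bandwidth_mbps < 5.0:
        base -= 10
    return max(base, 30)

# A phone on a congested cellular link gets a much smaller image
# than a desktop on broadband.
print(choose_jpeg_quality("phone", 0.5))    # heavily compressed
print(choose_jpeg_quality("desktop", 50.0))  # near-original quality
```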
What’s the next step for this technology? Something not commercial yet?
We are working on this for HTML: finding chunks of HTML that are common from one page to another, breaking the page into little pieces, and finding the pieces we can cache because you’ve probably seen them before. This might account for 95 percent of a Web page. Then we can figure out, with good accuracy, what content you will be looking for when you visit a Web page, based on usage profiles that include IP address, location, access speed, and type of device. And we pre-stage all that content on a server near you, or better yet on your device. It’s really fast because what you are looking for might already be on your device.
What can you do to speed up video?
For major events today—like an NFL or major league baseball game—you are using Akamai without knowing it. The video is being delivered from an Akamai server near you. The next generation of that technology will make greater use of end-users to distribute content. We call it client-assisted delivery. If you have a well-connected, well-powered machine, your machine can be used to send the information to a neighbor. Then we don’t have to send all the information to each neighbor individually.
Right now, 30 million devices, typically laptops or desktops, are doing this. But there are billions of devices out there. We would like to move this into tablets and other well-connected devices. We want to be on every device: anything that has the right connectivity, CPU, and memory. This enables us to achieve great scale, lower cost, and high quality.
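Client-assisted delivery hinges on picking which end-user machines are fit to relay content. A minimal sketch of that selection step, assuming hypothetical peer attributes and thresholds (these are not Akamai's actual criteria):

```python
# Sketch of relay selection for client-assisted delivery: instead of the
# origin sending a video chunk to every viewer, it seeds a few well-connected,
# well-powered peers, which then forward to neighbors. Attributes and the
# 5 Mbps uplink threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Peer:
    name: str
    uplink_mbps: float
    on_mains_power: bool  # a battery-powered phone is a poor relay candidate

def pick_relays(peers: list[Peer], needed: int) -> list[Peer]:
    """Choose up to `needed` relays with the best uplink, mains-powered only."""
    eligible = [p for p in peers if p.on_mains_power and p.uplink_mbps >= 5.0]
    return sorted(eligible, key=lambda p: p.uplink_mbps, reverse=True)[:needed]

peers = [
    Peer("desktop-a", 50.0, True),
    Peer("laptop-b", 2.0, True),    # too slow to relay
    Peer("phone-c", 80.0, False),   # fast, but on battery
    Peer("desktop-d", 20.0, True),
]
print([p.name for p in pick_relays(peers, 2)])
```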
What other technologies and strategies are you working on?
We are in trials with Ericsson, which makes equipment for carrier networks, to provide an end-to-end quality-of-service guarantee that can be sold as a service. The value of a bit for an e-commerce site—say, you are making a $300 ticket purchase—might be five orders of magnitude more valuable than a bit sent as part of a movie trailer. We will tag traffic so that it is identified as having a higher quality-of-service guarantee. It allows carriers to better monetize their traffic. I am hopeful you will see this in the next year.
Doesn’t that violate ‘net neutrality’—the idea that no bits should be given priority over others?
No. The net neutrality concern arises when networks give preferential treatment to their own media content to the exclusion of other content owners. This technology allows anybody to have equal access, but those who want to pay more for quality of service can do so. Then everyone has equal access to the faster service.
Do we have a spectrum crunch right now? If not, how long will spectrum hold out?
Yes. Already in major cities or at sporting events, it is very hard to get an uncongested connection.
Are efforts like Google Fiber in Kansas City, and the Chattanooga deployment—both of which deliver one-gigabit-per-second speeds, roughly 100 times the average U.S. connection speed—solving problems in those areas?
You can’t measure the capacity of the Internet from the last-mile connection. Just because you have that 100-megabit or even one-gigabit connection from your house to some local data center doesn’t mean you will get even a five-megabit stream if you are being served from a data center halfway across the country.