Supercharged File Sharing
Cooperating with file-sharing networks could avert congestion.
As Internet service providers (ISPs) struggle with increasing traffic from peer-to-peer file-sharing networks, some have resorted to simply throttling this data, attracting ire from both users and regulators. Under a scheme that should be rolled out early next year, some ISPs plan to take a different approach: cooperating with file-sharing networks so that they share data more effectively.
The new scheme is called Provider Portal for Applications (P4P), and it’s a voluntary, open standard that requires ISPs to share some information about how their networks are laid out. Initial tests have shown that the P4P framework can dramatically speed up download times for file sharers while also reducing the bandwidth costs for ISPs.
Peer-to-peer file sharing has exploded over the past decade, driven by increasing consumer bandwidth and growing demand for large amounts of data. Rather than serve files from a centralized location, file-sharing networks scatter pieces of files among thousands of individual computers and help users find and download this data. File sharing now accounts for about 70 percent of all network traffic, and some ISPs have found it hard to deal with the increased load. In August, Comcast was rebuked by the Federal Communications Commission for trying to throttle peer-to-peer traffic on its network.
The new protocol reduces file-trading traffic by having ISPs reveal some internal network information to peer-to-peer "trackers," the servers used to locate files for downloading. Trackers can then use this information to arrange file sharing more efficiently, connecting computers that are nearer to each other and sharing files at the lowest resource cost to the ISPs involved. As an example, suppose someone running a BitTorrent client tries to download an MP3. As it stands, the file might come from a computer halfway around the world, even if someone next door also happens to have a copy. With P4P, the tracker knows to connect computers that are closer together, so the bits travel a shorter distance.
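The idea can be sketched in a few lines of code. In this hypothetical example, a tracker knows each peer's rough network region (information of the kind an ISP could supply under P4P) and prefers peers in the requester's own region; the region labels and selection policy are illustrative assumptions, not the actual P4P protocol messages.

```python
def select_peers(requester_region, peers, limit=3):
    """Return up to `limit` peers, preferring those in the requester's region."""
    # Python's sort is stable, so same-region peers move to the front
    # while the original order among the rest is preserved.
    ranked = sorted(peers, key=lambda p: p["region"] != requester_region)
    return ranked[:limit]

# A small swarm: one neighbor, two distant peers.
swarm = [
    {"host": "peer-tokyo", "region": "apac"},
    {"host": "peer-nextdoor", "region": "us-east"},
    {"host": "peer-london", "region": "emea"},
]

chosen = select_peers("us-east", swarm, limit=2)
# The neighbor is chosen before the peers halfway around the world.
```

Without the region information, the tracker could only hand back an arbitrary subset of the swarm; with it, the "someone next door" copy wins.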
“We knew, as a peer-to-peer company, that in order for peer to peer to become successfully commercialized, network operators had to be cooperative,” says Robert Levitan, CEO of Pando Networks, a company that offers commercial peer-to-peer content delivery services. “Instead of blocking traffic, they had to get involved in it.”
Pando is a founding member of the P4P Working Group, a consortium set up in 2007 to develop and test new technologies to make P2P more efficient. Members include the ISPs Verizon and Comcast, the peer-to-peer software business BitTorrent, the network equipment manufacturer Cisco Systems, and academic institutions including Yale and Washington University.
Small-scale tests conducted in March by Yale researchers, Pando, Verizon, and Telefonica Group suggest that the system could cut the average distance that data has to travel from 1,000 miles to 160 miles, and reduce the number of connections that have to be made through major hubs from 5.5 to 0.69. This would help ISPs avoid the costs incurred when information is handed between major networks. The approach could also benefit users, by increasing download speeds by an average of 20 percent, according to the same tests.
A more recent study carried out this fall with Comcast, Verizon, and AT&T showed that peer-to-peer download speeds could increase 50 to 150 percent using the technology. And the amount of content that is delivered entirely within each ISP should increase from 14 percent to as much as 89 percent.
But the P4P approach is not without its challenges. The protocol depends on ISPs calculating and publishing "p-distance values" that tell peer-to-peer trackers how best to connect different file sharers, which means ISPs must be willing both to do that work and to reveal something about how their networks are structured. There are also legal questions: because many files traded on peer-to-peer networks infringe copyright, ISPs will want to make sure that cooperating with P2P networks won't make them liable for that infringement.
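One way to picture p-distance values is as a cost table the ISP publishes for moving data between partitions of its network, which a tracker then uses to rank candidate peers from cheapest to costliest. The partition names ("PIDs") and costs below are invented for illustration; P4P leaves the actual cost semantics up to the ISP.

```python
# Hypothetical p-distance table published by an ISP: cost of moving data
# between pairs of network partitions (PIDs). Values are illustrative.
p_distance = {
    ("pid-1", "pid-1"): 0,    # same partition: cheapest
    ("pid-1", "pid-2"): 10,   # same ISP, different metro area
    ("pid-1", "pid-3"): 50,   # crosses a transit link to another provider
}

def rank_peers(requester_pid, peers):
    """Order candidate peers from cheapest to costliest p-distance."""
    return sorted(peers, key=lambda p: p_distance[(requester_pid, p["pid"])])

candidates = [
    {"host": "far-peer", "pid": "pid-3"},
    {"host": "metro-peer", "pid": "pid-2"},
    {"host": "local-peer", "pid": "pid-1"},
]

ordered = rank_peers("pid-1", candidates)
# local-peer ranks first; far-peer, reached over a transit link, ranks last.
```

Because the table carries only abstract costs, the ISP can steer traffic away from expensive transit links without disclosing its detailed topology.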
Nonetheless, Richard Woundy, senior vice president for software and applications with Comcast, admits that the idea is appealing. “The ISP benefits because traffic isn’t going over as much infrastructure,” he says. “It’s staying within a metro area, or at least staying within the ISP. It’s not going over a transit link to an upstream provider.”
Doug Pasko, principal member of the technical team at Verizon, says that Pando and Verizon plan to roll out a P4P implementation soon, possibly by the end of January. The P4P working group has also submitted an application to the Internet Engineering Task Force to seek official approval for the P4P standard. And Pasko doesn't think that legal problems are likely. "P4P itself doesn't increase our legal exposure," he says. "That's because we're offering optimization guidance. We don't have any information on what that content is."
Comcast is also interested in implementing the technology, says Barry Tishgart, vice president for Internet services for the company. “Our inclination is, we want to do it. The results of our trial are very positive,” he says. But the tests carried out so far have been relatively small: the one performed this fall shared a single 21-megabyte video file, which was downloaded 15,000 times. So Tishgart wants to see what happens when larger file sizes and large swarms of peers try to download a popular file.
Finally, the success of the scheme depends on the thousand or more peer-to-peer trackers that currently exist agreeing to use the P4P protocol. Tishgart says that they tend to be suspicious of ISPs' motives. But if they see performance gains for their users and no downside, they may be much more likely to cooperate.