P4P Remodels File Sharing

Peer-to-peer routing technology being tested by Internet service providers could change networking.
November 17, 2009

“Peer-to-peer” (P2P) is synonymous with piracy and bandwidth hogging on the Internet. But now, Internet service providers and content companies are taking advantage of technology designed to speed the delivery of content through P2P networks. Meanwhile, standards bodies are working to codify the technology into the Internet’s basic protocols.

Rather than sending files to users from a central server, P2P file-sharing networks distribute pieces of a file among thousands of computers and help users find and download this data directly from one another. This is a highly efficient way to distribute data, resistant to the bottlenecks that can plague centralized distribution systems, but it uses large amounts of bandwidth. Even as P2P traffic slowly declines as a percentage of overall Internet traffic, it is still growing in volume. In June, Cisco estimated that P2P file-sharing networks transferred 3.3 exabytes (3.3 billion billion bytes) of data per month.
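The piece-based distribution described above can be sketched in a few lines. This is an illustrative simplification, not any real client's code: the tiny piece size and sample payload are invented for the example, and real networks use pieces of hundreds of kilobytes, each hashed so peers can verify data downloaded from strangers.

```python
import hashlib

PIECE_SIZE = 4  # bytes, for illustration; real networks use ~256 KB-4 MB pieces


def split_into_pieces(data: bytes, piece_size: int = PIECE_SIZE):
    """Split a file into fixed-size pieces, each paired with a hash so
    peers can verify chunks downloaded from untrusted sources."""
    pieces = []
    for offset in range(0, len(data), piece_size):
        chunk = data[offset:offset + piece_size]
        pieces.append({
            "index": offset // piece_size,
            "sha1": hashlib.sha1(chunk).hexdigest(),
            "data": chunk,
        })
    return pieces


# A 20-byte "file" yields 5 pieces that could come from 5 different peers.
pieces = split_into_pieces(b"example file payload")
print(len(pieces))  # 5
```

Because each piece carries its own checksum, a downloader can fetch different pieces from many peers at once and reassemble the file, which is what makes the approach resistant to central bottlenecks.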

While a PhD student at Yale University in 2006, Haiyong Xie came up with the idea of “provider portal for peer-to-peer,” or P4P, as a way to ease the strain placed on networking companies by P2P. This system reduces file-trading traffic by having ISPs share specially encoded information about their networks with peer-to-peer “trackers,” the servers used to locate files for downloading. Trackers can then make file sharing more efficient by preferentially connecting computers that are closer together and reducing the amount of data exchanged between different ISPs.
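A minimal sketch of that idea: the ISP publishes a map that assigns addresses to network partitions along with the routing cost between partitions, and the tracker uses it to hand out nearby peers first. The prefix map, partition names, and cost values below are all invented for illustration; they are not the real P4P interface.

```python
from typing import Dict, List, Tuple

# Hypothetical ISP-provided map: IP prefix -> network partition ID.
NETWORK_MAP: Dict[str, str] = {
    "10.1.": "pid-east",
    "10.2.": "pid-west",
    "172.16.": "pid-other-isp",
}

# Hypothetical ISP-provided inter-partition costs (lower = cheaper/closer).
PID_COSTS: Dict[Tuple[str, str], int] = {
    ("pid-east", "pid-east"): 0,
    ("pid-east", "pid-west"): 5,
    ("pid-east", "pid-other-isp"): 50,
}


def pid_for(ip: str) -> str:
    """Look up which partition an address belongs to."""
    for prefix, pid in NETWORK_MAP.items():
        if ip.startswith(prefix):
            return pid
    return "pid-unknown"


def rank_peers(requester_ip: str, candidate_ips: List[str]) -> List[str]:
    """Order candidate peers cheapest-first, so the tracker connects a
    downloader to nearby (often same-ISP) peers before distant ones."""
    me = pid_for(requester_ip)

    def cost(ip: str) -> int:
        return PID_COSTS.get((me, pid_for(ip)), 100)  # unknown = expensive

    return sorted(candidate_ips, key=cost)


# Same-partition peer first, cross-ISP peer last.
print(rank_peers("10.1.0.7", ["172.16.3.1", "10.2.0.9", "10.1.0.42"]))
```

The key design point is that the ISP never reveals its raw topology: it only exposes coarse partitions and relative costs, which is enough for the tracker to keep most traffic local.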

During its meetings last week in Japan, the Internet Engineering Task Force, which develops Internet standards, continued work on building P4P into standard Internet protocols. However, Xie believes that those efforts will take two or three more years to come to fruition. In the meantime, he says, many P2P application makers and Internet carriers are already implementing their own versions of P4P.

Pando Networks, which facilitates Internet content delivery, was the first company to adopt P4P techniques. In collaboration with Xie, Pando worked with Verizon, Telefónica, AT&T, and Comcast to run two sets of P4P tests last year; the results showed that P4P could speed up download times for file sharers by 30 percent to 100 percent, while also reducing the bandwidth costs for ISPs. Since then, Verizon and Telefónica have both implemented versions of P4P within their networks, though the network maps may not be available in all regions or to every P2P provider. Several other ISPs are considering implementing P4P, Xie says; Comcast, for instance, publicly stated its interest in the technology following last fall’s trial.

Robert Levitan, Pando’s CEO, says that the company used the expertise it gained through those trials to develop algorithms that automatically derive network maps, based on information gathered from software installed on individual users’ machines (more than 30 million computers have Pando’s media booster software installed). The company uses the maps to help route content more quickly to those same computers. The company’s clients include Nexon America, one of the largest free-to-play online video-game companies, as well as a client that uses P4P to deliver full-length HD shows over the Internet.
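One way such a map could be derived automatically, sketched under loose assumptions (this is not Pando's actual algorithm): clients report the round-trip times they observe to one another, and peers with consistently low mutual latency are grouped into the same locality.

```python
from collections import defaultdict


def build_localities(rtt_samples, threshold_ms=20):
    """Group peers into localities from client-reported measurements.

    rtt_samples: list of (ip_a, ip_b, rtt_ms) tuples reported by clients.
    Uses union-find: peers whose round-trip time falls under the
    threshold are linked into the same group.
    """
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for a, b, rtt in rtt_samples:
        find(a)
        find(b)
        if rtt < threshold_ms:
            union(a, b)

    groups = defaultdict(set)
    for ip in parent:
        groups[find(ip)].add(ip)
    return list(groups.values())


samples = [("10.1.0.1", "10.1.0.2", 5),
           ("10.1.0.2", "10.1.0.3", 8),
           ("10.1.0.1", "10.2.0.1", 45)]
print(len(build_localities(samples)))  # 2 localities
```

The resulting groups play the same role as ISP-published partitions: a tracker can prefer peers from the requester's own locality without the ISP having to disclose anything.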

Indeed, Xie says, as multimedia becomes increasingly dominant on the Internet, demand for P4P implementations will grow, particularly from ISPs seeking to reduce what they spend on new fiber and on inter-ISP data transit. Video and audio streaming from sites such as YouTube and Hulu already accounts for almost 27 percent of global Internet traffic, according to a report by network-management systems vendor Sandvine. Cisco predicts that by 2013, video alone will account for over 60 percent of all consumer Internet traffic. With this kind of increase in high-bandwidth traffic, Levitan says, “we’re not going to be able to have the Internet we all want” without P4P, or a similar technology, to help scale the physical networks at a reasonable cost.

Xie and Levitan see two main difficulties for the continued growth of P4P. The first is P2P’s association with software, music, and video piracy. ISPs want to make sure that working with P2P companies to improve their service won’t make them liable for any illegal file sharing. But Levitan is optimistic that increasing numbers of legal uses for P2P technology will help reform its image. For example, Internet telephony service Skype relies on P2P connections, as does Blizzard Entertainment, maker of the popular online game World of Warcraft. One major broadcaster began using Octoshape’s P2P technology to boost its delivery of live streaming video earlier this year, and the PGA, NBA, and NASCAR all use it to support live webcasts of sporting events.

The other potential problem is perhaps trickier: even though P4P benefits both consumers and ISPs, it treats P2P traffic differently from other data flowing over the Internet, and so could technically violate the Federal Communications Commission’s proposed net neutrality regulations. In fact, one of Xie’s original motivations in developing the P4P protocols was to help carriers avoid having to limit P2P traffic for cost reasons, as Comcast did, much to consumers’ ire, in 2006. He admits that P4P would seem to violate the letter of net neutrality, if not the spirit, by “helping” P2P applications preferentially. “I don’t have a good, clear answer to those concerns,” Xie says. Still, he and other P4P proponents remain optimistic that the technology’s advantages will win the day.

Levitan thinks that the benefits such companies are seeing will allow P4P to move forward. “On a technology basis, and even from a policy basis, I think the FCC could see, wow, this could really help networks, and maybe it changes the network neutrality debate,” Levitan says, because there wouldn’t be a scarcity of network capacity anymore.

Illustration by Rose Wong
