Peer-to-Peer Comes Clean
They’re not just for file-sharing anymore: P2P networks are transmitting phone calls, blocking spam, backing up hard drives, and spreading scholarship.
Despite ongoing efforts to blackball it, peer-to-peer technology is fast gaining ground. P2P earned its bad reputation as the mechanism powering massive copyright-violation systems like Kazaa and Morpheus. But as I wrote a year ago in this space, this technology can be used for good: it has the power to strengthen the Internet against terrorist attack, allow even the smallest publishers to distribute information to the multitudes, and protect controversial information against censorship and suppression.
What I did not anticipate a year ago was that the most important peer-to-peer application to emerge in 2004 would be telephony. Yet that’s what happened with Skype, the bandit Internet voice telephony system that has served more than 1.7 billion minutes of peer-to-peer telephone calls since its debut in August 2003.
But Skype is just one of several emerging peer-to-peer systems. Another is LionShare, a project started by Penn State University with a grant from the Mellon Foundation to create a series of networks for sharing scholarly information among academics. The system is designed to let individuals index and otherwise manage their personal files, then make these files available throughout a P2P network.
“Many instructors, scholars, researchers, and librarians across higher education institutions have ‘hidden’ repositories of digital content used for teaching, research, and outreach stored on their networks or even individual hard drives,” reads the LionShare grant proposal. The goal of LionShare is to open up this content through a federated search system so that “a single search query [could] reach all available repositories,” allowing academics to share photographs, sounds, instructional videos, and even PowerPoint presentations to a degree never before possible.
Of course, professors could just put their materials on websites and let Google handle it all. But as anybody who has tried this knows, there is no easy way to ask Google specifically for contemporaneous photographs of Victorian houses in New England, authenticated by architectural experts, and available for royalty-free use in academic publications. The problem is that Google does a lousy job with metadata and other kinds of catalog information, the sort of material that makes or breaks academic careers. LionShare will give researchers a tool for cataloging their own collections and then exporting those catalogs throughout academia.
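To make the contrast concrete, here is a minimal sketch of the metadata-first, federated search that LionShare envisions: one structured query fanned out across several catalogs. Every field name and record below is a hypothetical illustration, not LionShare's actual schema.

```python
# A toy federated metadata search: each "repository" receives the same
# structured request, and the matching catalog records are merged.
# All field names and records here are hypothetical.

def search_repository(records, **criteria):
    """Return records whose metadata matches every given field exactly."""
    return [r for r in records if all(r.get(k) == v for k, v in criteria.items())]

# Two hypothetical departmental catalogs.
art_history = [
    {"subject": "Victorian houses", "region": "New England",
     "license": "royalty-free", "verified_by": "architectural expert"},
]
architecture = [
    {"subject": "Victorian houses", "region": "New England",
     "license": "restricted", "verified_by": "unknown"},
]

# One query reaches every repository -- the federated part.
hits = []
for repo in (art_history, architecture):
    hits.extend(search_repository(repo, subject="Victorian houses",
                                  license="royalty-free"))
print(len(hits))  # 1: only the royalty-free, expert-verified record matches
```

The point is that structured fields, not keyword matching, decide what comes back; that is exactly what a crawl-based search engine struggles to offer.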
There are a number of other potentially great P2P systems out there as well. BitTorrent, by Bram Cohen, is designed to let small software publishers distribute their wares to a large, eager audience. Instead of hosting popular downloads on hugely expensive server farms, the idea of BitTorrent is to replicate popular downloads across hundreds or thousands of individually owned PCs. Think of it as Akamai for the little guy: in theory there shouldn’t be any danger in copying files to multiple machines that you don’t own, provided that every file is digitally signed. Unfortunately, it’s beginning to look like the project has stalled. Still, the idea is fundamentally sound, and it’s sure to be extended in the coming years.
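The integrity safeguard that makes copying from strangers safe can be sketched in a few lines. BitTorrent publishes a cryptographic hash of every piece of a file in a trusted metainfo (.torrent) file, and downloaders reject any piece from a peer that fails verification; the sketch below follows that idea with SHA-1 and a toy piece size.

```python
import hashlib

PIECE_SIZE = 4  # tiny for illustration; real BitTorrent pieces are far larger

def make_metainfo(data):
    """Split data into fixed-size pieces and record each piece's SHA-1 hash."""
    pieces = [data[i:i + PIECE_SIZE] for i in range(0, len(data), PIECE_SIZE)]
    return [hashlib.sha1(p).hexdigest() for p in pieces]

def verify_piece(piece, index, metainfo):
    """Accept a piece from an untrusted peer only if its hash matches."""
    return hashlib.sha1(piece).hexdigest() == metainfo[index]

original = b"hello, peer-to-peer world"
metainfo = make_metainfo(original)      # published by the original source

assert verify_piece(b"hell", 0, metainfo)      # genuine piece: accepted
assert not verify_piece(b"evil", 0, metainfo)  # tampered piece: rejected
```

Because only the small metainfo file has to come from a trusted source, the bulk of the data can safely come from anyone.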
And then there’s a clever peer-to-peer system called Vipul’s Razor, which is being used to filter spam. A small software agent runs on every computer attached to the network and detects when e-mail arrives. The theory is that if the same message appears in multiple locations at more or less the same time, it’s probably spam. This approach is an excellent complement to content-based anti-spam systems: the content systems identify spam that looks like spam, while Razor identifies mail that is sent the way spam is typically sent, no matter what it looks like. Support for Razor is built into the popular SpamAssassin anti-spam system. In an examination of my spam from September, Razor identified one out of three spam messages, pretty good considering that it doesn’t use any keywords at all.
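Razor’s sighting-count idea can be sketched in a few lines. The real system uses fuzzy signatures so that trivially mutated copies of a message still match; the simplification below uses an exact hash of the whitespace-normalized, lowercased text, and the threshold value is invented for illustration.

```python
import hashlib
from collections import Counter

reports = Counter()   # shared registry: signature -> number of sightings
SPAM_THRESHOLD = 3    # hypothetical cutoff, not Razor's actual value

def signature(body):
    # Razor uses fuzzy signatures; an exact hash of the
    # normalized body is a deliberate simplification.
    normalized = " ".join(body.split()).lower()
    return hashlib.sha1(normalized.encode()).hexdigest()

def report(body):
    """An agent tells the registry it has seen this message."""
    reports[signature(body)] += 1

def is_spam(body):
    """A message seen in enough places at once is probably spam."""
    return reports[signature(body)] >= SPAM_THRESHOLD

msg = "BUY   cheap   watches now"
for _ in range(3):      # the same message arrives at three different peers
    report(msg)
print(is_spam("buy cheap watches now"))  # True: widely sighted, so flagged
```

Note that no keyword list appears anywhere; the only evidence is how widely and how quickly the same message shows up.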
Peer-to-peer has been an active area of academic research as well. Much of the research has focused on trying to create so-called distributed hash tables, or DHTs: databases that are shared among multiple computers all over the Internet. The best systems automatically find the computers on the Net that are part of the DHT, store data redundantly on multiple machines, use digital signatures and encryption to protect the information, and even include distributed reputation, trust, and payment systems to keep all of the participants honest and motivated. This sounds like just the sort of technology that companies like Kazaa should be gaga over. Strangely, however, most of the bandit MP3 networks have stayed away from the academic DHTs and have built their own systems instead.
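The routing trick at the heart of academic DHTs such as Chord is consistent hashing: keys and nodes are hashed onto the same circular space, and each key is stored on the first node clockwise from it, so machines can join and leave without reshuffling most of the data. A minimal sketch, with hypothetical node names:

```python
import hashlib
from bisect import bisect_right

def ring_point(s):
    """Map a string to a point on the hash ring."""
    return int(hashlib.sha1(s.encode()).hexdigest(), 16)

class HashRing:
    """A toy DHT routing table: each key belongs to the first node
    clockwise from the key's position on the ring."""
    def __init__(self, nodes):
        self.points = sorted((ring_point(n), n) for n in nodes)

    def node_for(self, key):
        i = bisect_right(self.points, (ring_point(key),)) % len(self.points)
        return self.points[i][1]

ring = HashRing(["node-a.example", "node-b.example", "node-c.example"])
owner = ring.node_for("some-file.mp3")
# Adding or removing one node only moves the keys that hashed near it,
# which is why a DHT tolerates machines constantly joining and leaving.
```

Real systems layer replication, signed records, and finger-table routing on top of this, but the ring is the foundation.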
Fortunately, many academics are trying to push their research more toward real-world applications. In August, for instance, the Institute of Electrical and Electronics Engineers held its Fourth International Conference on Peer-to-Peer Computing in Zurich, Switzerland. Among the highly technical presentations were papers on how to make unstructured peer-to-peer networks really big, more techniques for securing peer-to-peer systems against attack, techniques for squeezing more functionality out of less bandwidth, and how to build P2P networks that respond to changes in the underlying Internet. Though highly technical, the papers nevertheless make interesting reading, especially if you are an entrepreneur looking for a new company idea.
Indeed, there are plenty of other peer-to-peer applications waiting out there in the wings, perhaps getting ready to be the next Skype. A few weeks ago, for instance, MIT graduate Tim Macinta put out the first beta version of Magic Mirror Backup, a peer-to-peer system that automatically backs up the computers throughout your home or office to each other. (Various alpha versions of Magic Mirror have been in circulation for about a year.) The idea is to put all of those unused gigabytes on your various hard drives to work backing up each other. And last year Microsoft published a Windows XP Peer-to-Peer Software Development Kit, which, the company says, contains “all software required to create decentralized applications that harness the collective power of edge-of-the-network PCs.” I haven’t yet heard of anything that was created with it, but perhaps next year we will all be astonished.
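The mutual-backup idea can be sketched with plain dictionaries standing in for disks: every machine pushes copies of its files to its peers, so any single disk can die without losing data. A real system like Magic Mirror would presumably also encrypt the copies and cope with peers going offline; all names below are hypothetical.

```python
# Toy peer-to-peer backup: each machine replicates its files to every
# other machine, namespaced by owner so backups never collide.
machines = {"desktop": {}, "laptop": {}, "office-pc": {}}

def store(owner, name, data):
    """Save a file locally and push a backup copy to every peer."""
    machines[owner][name] = data
    for peer, disk in machines.items():
        if peer != owner:
            disk[f"backup/{owner}/{name}"] = data

def restore(owner, name):
    """After 'owner' loses its disk, recover from any surviving peer."""
    for peer, disk in machines.items():
        if peer != owner and f"backup/{owner}/{name}" in disk:
            return disk[f"backup/{owner}/{name}"]

store("desktop", "thesis.txt", b"draft 1")
machines["desktop"].clear()              # simulate a disk failure
print(restore("desktop", "thesis.txt"))  # b'draft 1'
```

The spare capacity doing the work here is exactly those "unused gigabytes" sitting on everyone's drives.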
Most people who use the Internet today are accustomed to the idea that there are low-cost clients on people’s desktops and expensive servers closeted away in telecom hotels with high-quality power and lots of bandwidth. But there is nothing inherently client-server about the Internet’s underlying architecture or design. As Internet service providers deliver more bandwidth to homes and small businesses, and as desktop hard drives grow ever larger, we are sure to see more approaches for harnessing these underutilized resources.
In the future, peer-to-peer may be the norm, and we may look back at today’s client-server systems as some sort of weird, unreliable, transitory technology.