For the first time ever, this year’s Super Bowl was streamed live online, to the initial delight, and then disappointment, of fans, who experienced poor image quality and delays of several minutes.
Bram Cohen, the man behind the BitTorrent file-sharing protocol, is developing software that could fix such problems. The protocol facilitates the distribution of large files by having users serve up fragments of a file to other users even as they download it, making it possible to share very large files without a single central server.
Cohen’s company, also known as BitTorrent, is now working on technology that could allow a person with far fewer resources than a TV studio to stream live footage to an audience of millions.
“To this day, most of the live video people consume isn’t over the Internet, it’s over the cable system,” says Cohen. “Cable is very well optimized for sending out live events at a low delay, which has, to date, been very hard over the Internet.”
Cable and over-the-air TV can efficiently target multiple people because viewers can all tune into the same stream of content, but the Internet’s design requires a separate copy to be sent to each person. That makes streaming online video of any kind expensive (even Google hasn’t found a way to make YouTube profitable yet) and live video especially so. Specialized servers and other infrastructure are typically required.
Cohen’s solution, known as BitTorrent Live, could make it possible for almost anyone to offer a live stream to millions. Like the original BitTorrent, the scheme relies on viewers all running software that links into a network that distributes data directly between users, in what is known as a peer-to-peer design, which is much more efficient than every user being served by a central server. A key benefit of the approach is that as more people try to download a file, the network’s capacity to serve that file also grows.
Cohen has been running public trials of BitTorrent Live since late last year, streaming live DJ sets from San Francisco. So far, the system has been able to deliver live video with less than five seconds of delay, although the largest audience so far has been roughly 350 viewers.
Cohen says that his main interest is in the technical challenge, but adds that changing how people consume video does matter to him. “I view the one-way nature of television as a bad thing, and moving to a medium with integrated interactivity and social features will be a good thing,” he says.
Reducing the resources needed to offer live video could even help journalists, bloggers, or protestors trying to show the world what they are witnessing. Online video was important to last year’s protests in Egypt and other Middle Eastern countries; live feeds with large audiences could have an even larger impact.
Cohen says BitTorrent Live could even help conventional broadcasters and studios distribute their premium TV and movie content more efficiently. He says the approach may even be better than centralized streaming services, which typically cap quality to keep costs down. However, as with the original BitTorrent protocol, the software could potentially be used to stream copyrighted material—without providing a single point where it can easily be shut down.
It’s too early for Cohen to say exactly how BitTorrent Live will be made available, and whether (like his original blockbusting technology) it will be done in a way that allows third parties to use it as they wish. “The protocol is still early in development, so we want to maintain the ability to [improve] it in a reliable way,” he says. “We’ll have more information on potential third-party implementations once we are closer to launch.”
BitTorrent file-sharing software breaks files into many small pieces, so a person's client can assemble a download from pieces sourced from many other users while simultaneously sharing the pieces it already holds. BitTorrent Live uses the same strategy, but with the added constraint that data must reach users at just the right time to keep the live stream working.
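The piece-exchange idea described above can be sketched in a few lines of Python. This is an illustrative toy, not the real BitTorrent wire protocol: the piece size, peer setup, and data structures are all assumptions made for the example.

```python
# Toy sketch of BitTorrent-style piece exchange (illustrative only; the real
# protocol uses tracker/DHT peer discovery, hashing, and much larger pieces).
import random

PIECE_SIZE = 4  # bytes per piece; an assumed, unrealistically small value

def split_into_pieces(data: bytes) -> dict[int, bytes]:
    """Break a file into numbered pieces, as a BitTorrent client does."""
    return {i: data[off:off + PIECE_SIZE]
            for i, off in enumerate(range(0, len(data), PIECE_SIZE))}

def download(pieces_by_peer: list[dict[int, bytes]], n_pieces: int) -> bytes:
    """Assemble a file from pieces held by many peers, arriving in any order."""
    have: dict[int, bytes] = {}
    while len(have) < n_pieces:
        peer = random.choice(pieces_by_peer)          # pick a peer to ask
        missing = set(range(n_pieces)) - have.keys()
        available = missing & peer.keys()
        if available:
            idx = available.pop()                     # request one missing piece
            have[idx] = peer[idx]                     # the peer "uploads" it
    return b"".join(have[i] for i in range(n_pieces))

original = b"A file shared via BitTorrent"            # 28 bytes -> 7 pieces
pieces = split_into_pieces(original)
# Each peer holds only a random subset; one "seed" holds everything.
peers = [dict(random.sample(sorted(pieces.items()), k=5)) for _ in range(6)]
peers.append(pieces)
result = download(peers, len(pieces))
assert result == original  # the file reassembles despite no central server
```

The point of the sketch is the same one the article makes: no single machine needs the whole file, and every downloader is also an uploader.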
That makes lag the biggest challenge for BitTorrent Live, Cohen acknowledges. “Typically, latency may be between one and three seconds longer than it is in a very well optimized centralized system,” he says. But the five-second delay achieved so far is likely acceptable for most use cases. Cohen designed BitTorrent Live to automatically drop connections to peers that are unable to serve up fresh data, and to assign high priority to connections that can, to ensure data spreads widely as fast as possible.
Arvind Krishnamurthy, an associate professor at the University of Washington with an interest in peer-to-peer systems, says Cohen’s approach could unleash a new distribution mechanism for video. “I can start a stream that, if it is of interest, many millions of people can watch it,” he says. “You don’t have to be YouTube. It’s self-scaling in nature.”
Vindication that peer-to-peer video can serve large audiences comes from China, Krishnamurthy notes, where TV services such as PPLive and QQLive serve hundreds of thousands of viewers at once. However, those services do not typically offer live video or the type of quality that Web users in the West are used to.
Cohen’s biggest challenge may be to replicate his success in designing the original BitTorrent protocol to discourage selfish users from downloading but never uploading, says Krishnamurthy. That was done by having peers in a BitTorrent network provide more data to other peers that reciprocate, a strategy that can’t work with streaming video data because it gets stale fast. “At any point in time, the data of interest to me is the next 30 seconds of data,” says Krishnamurthy. “It’s a small number of data blocks, and that makes it harder to make peer-to-peer streaming work.”
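Krishnamurthy's point about staleness can be made concrete with a toy calculation. The 30-second window comes from his quote; the per-piece playback duration is an assumed value for illustration.

```python
# Why streaming data "gets stale": only pieces inside a short window ahead of
# the playback position are worth trading. PIECE_SECONDS is an assumption;
# the 30-second window is from Krishnamurthy's quote.
PIECE_SECONDS = 2      # assumed playback time covered by one piece
WINDOW_SECONDS = 30    # "the next 30 seconds of data"

def useful_pieces(playback_piece: int) -> range:
    """Pieces a viewer currently wants; everything else has no trade value."""
    width = WINDOW_SECONDS // PIECE_SECONDS
    return range(playback_piece, playback_piece + width)

# At playback piece 100, only pieces 100..114 matter -- a pool of just
# 15 blocks, far too few to sustain file-style tit-for-tat exchange.
print(list(useful_pieces(100))[:3], "...", len(useful_pieces(100)))
# → [100, 101, 102] ... 15
```

In classic BitTorrent the tradable pool is the whole file, often thousands of pieces; here it is a sliver that expires within seconds, so reciprocity has little to work with.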
Krishnamurthy and colleagues used data from PPLive in China to test a scheme of their own that reduced the numbers of users who received incomplete streams by more than a quarter. In that system, users who passed on video data in a timely way to others themselves received data at a higher quality and with less delay. Their design also routed the newest video data via users with the most bandwidth to spare to ensure that it spread out more quickly inside the network.
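A rough sketch of the incentive idea Krishnamurthy's group tested follows; the names and thresholds are invented for illustration, and the published system's actual policy differs in detail.

```python
# Hypothetical sketch of the two ideas described above: (1) peers that
# forward video promptly earn a higher-quality stream, and (2) the newest
# pieces are routed first via peers with the most spare upload bandwidth.
from dataclasses import dataclass

@dataclass
class Viewer:
    name: str
    forwarded_on_time: float  # fraction of pieces relayed before their deadline
    spare_bandwidth: float    # Mbit/s available for uploading

def quality_tier(v: Viewer) -> str:
    """Reward timely forwarding with a better stream (threshold assumed)."""
    return "high" if v.forwarded_on_time >= 0.8 else "low"

def pick_relays(viewers: list[Viewer], k: int) -> list[str]:
    """Send the newest pieces first to the peers with most spare bandwidth."""
    ranked = sorted(viewers, key=lambda v: v.spare_bandwidth, reverse=True)
    return [v.name for v in ranked[:k]]

viewers = [Viewer("a", 0.9, 2.0), Viewer("b", 0.5, 8.0), Viewer("c", 0.85, 0.5)]
print([quality_tier(v) for v in viewers])  # → ['high', 'low', 'high']
print(pick_relays(viewers, 2))             # → ['b', 'a']
```

Tying quality and delay to forwarding behavior replaces file-sharing's tit-for-tat with an incentive that still works when the data itself expires in seconds.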