MIT Technology Review

Tracey Ho


Today's Internet transmissions chop files into packets, each of which is passed from router to router until it reaches its final destination. But when files get big or are sent to many users, transmitting them without clogging the network becomes complicated. With “network coding,” an idea first proposed in 2000, routers would jumble together the bits from different packets, forming new packets. Recombining the data in this way gives the end user additional information, theoretically speeding downloads and increasing network capacity. But early network coding schemes required a godlike central authority that knew how the packets were to be combined, a practical impossibility.

As a PhD student at MIT, Tracey Ho had a novel alternative: let network nodes mix packets together at random, tagging them with just enough information to help end users' computers recover the original data. This decentralized method automatically optimizes bandwidth use. “It sounds kind of insane,” says Muriel Medard, Ho's PhD advisor. “But it's not just that it works; you can't make it work better.”

As an assistant professor of electrical engineering and computer science, Ho still studies network coding. But only months after she first presented her “distributed random network coding” scheme, Microsoft researchers showed that it could clearly outperform today's multicast systems. The company has embarked on a project called Avalanche to commercialize the scheme.
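The core idea above, random linear network coding, can be sketched in a few lines. The code below is an illustrative toy, not Ho's actual scheme: a node emits random XOR combinations of the source packets over GF(2) (real systems typically use a larger field such as GF(2^8)), tags each coded packet with its coefficient vector, and a receiver recovers the originals by Gaussian elimination once it has collected enough independent combinations. All names here are hypothetical.

```python
import random

PACKET_LEN = 4  # bytes per original packet (toy size; an assumption)

def random_combination(packets):
    """Mix source packets with random GF(2) coefficients.

    Returns (coefficient_vector, coded_payload); the coefficient
    vector is the 'tag' that lets a receiver decode."""
    while True:
        coeffs = [random.randint(0, 1) for _ in packets]
        if any(coeffs):  # skip the useless all-zero combination
            break
    payload = bytes(PACKET_LEN)
    for c, p in zip(coeffs, packets):
        if c:
            payload = bytes(a ^ b for a, b in zip(payload, p))
    return coeffs, payload

def decode(coded, n):
    """Gauss-Jordan elimination over GF(2): recover the n originals.

    Returns None until the collected coefficient vectors span
    an n-dimensional space (i.e., n independent combinations)."""
    rows = [(list(c), bytearray(p)) for c, p in coded]
    for col in range(n):
        pivot = next((i for i in range(col, len(rows)) if rows[i][0][col]), None)
        if pivot is None:
            return None  # not yet enough independent combinations
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for i in range(len(rows)):
            if i != col and rows[i][0][col]:
                rows[i] = ([a ^ b for a, b in zip(rows[i][0], rows[col][0])],
                           bytearray(x ^ y for x, y in zip(rows[i][1], rows[col][1])))
    return [bytes(rows[i][1]) for i in range(n)]
```

A receiver simply keeps collecting coded packets until `decode` succeeds; no central coordination is needed, which is the point of the decentralized scheme:

```python
originals = [b"DATA", b"CODE", b"NETW"]
coded, result = [], None
while result is None:
    coded.append(random_combination(originals))
    if len(coded) >= len(originals):
        result = decode(coded, len(originals))
# result now equals originals
```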