MIT Technology Review
In 1948, the world was still an analog place. Candid Camera and Ed Sullivan were just beginning their long runs on TV; Jack Benny’s radio show had tens of millions of listeners. But bad reception was a fact of life. Electromagnetic interference, physical obstacles between a transmission tower and a receiver, and other sources of what engineers call “noise” routinely disrupted Benny’s monologues or the performances of Sullivan’s guests. In most areas, for at least some stations, people resigned themselves to snowy images or static-plagued audio.

That same year, however, Claude Shannon, SM ’40, PhD ’40, published a landmark paper in which he mathematically proved that even in the presence of a lot of noise, it was possible to transmit information with virtually no errors. It was an analog world, but Shannon’s stunning conclusion was the result of his ability to think digitally. Information in any medium, Shannon argued, could be represented using binary digits, or “bits”–a word that his paper introduced to the world. While noise in a communication channel can corrupt the bits, he explained, adding extra bits that are related to the original bits by some known algorithm–an error-correcting code–will make it possible to deduce the original sequence.
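To make the idea concrete, here is a toy sketch (ours, not Shannon's): the simplest possible error-correcting code, which just repeats each bit three times. The receiver takes a majority vote, so a single flipped copy of any bit is corrected automatically.

```python
# Toy 3x repetition code: each bit is sent three times, and a
# majority vote at the receiver undoes a single flipped copy.

def encode(bits):
    """Repeat every bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority-vote each group of three received bits."""
    out = []
    for i in range(0, len(received), 3):
        triple = received[i:i + 3]
        out.append(1 if sum(triple) >= 2 else 0)
    return out

message = [1, 0, 1, 1]
sent = encode(message)            # 12 bits on the wire for 4 bits of data
noisy = sent[:]
noisy[4] = 1                      # noise flips one transmitted bit
assert decode(noisy) == message   # the vote still recovers the message
```

The price of this protection is rate: three bits are transmitted for every bit of data, which is exactly the trade-off Shannon quantified.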

The noisier the channel, the more extra information must be added to make error correction possible. And the more extra information is included, the slower the transmission will be. Shannon showed how to calculate the smallest number of extra bits that could guarantee minimal error–and, thus, the highest rate at which error-free data transmission is possible. But he couldn’t say what a practical coding scheme might look like.
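That highest rate is what Shannon called channel capacity. For the textbook case of a binary channel that flips each bit with probability p (a "binary symmetric channel"–our illustration, not an example from Shannon's paper), the capacity has a simple closed form: C = 1 − H(p), where H is the binary entropy function.

```python
from math import log2

def bsc_capacity(p):
    """Capacity, in bits per transmitted bit, of a binary symmetric
    channel that flips each bit with probability p: C = 1 - H(p)."""
    if p in (0.0, 1.0):
        return 1.0  # a deterministic channel carries one full bit
    entropy = -p * log2(p) - (1 - p) * log2(1 - p)
    return 1 - entropy

# A channel that flips about 11% of its bits still supports reliable
# transmission at roughly half a bit of information per bit sent --
# i.e., a good code needs about one redundant bit per data bit.
print(round(bsc_capacity(0.11), 2))  # prints 0.5
```

Note what the formula promises: as long as a code's rate stays below C, arbitrarily reliable transmission is possible; above C, it is not. What Shannon left open is which codes actually achieve this.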

Researchers spent 45 years searching for one. Finally, in 1993, a pair of French engineers announced a set of codes–“turbo codes”–that achieved data rates close to Shannon’s theoretical limit. The initial reaction was incredulity, but subsequent investigation validated the researchers’ claims. It also turned up an even more startling fact: codes every bit as good as turbo codes, which even relied on the same type of mathematical trick, had been introduced more than 30 years earlier, in the MIT doctoral dissertation of Robert Gallager, SM ’57, ScD ’60. After decades of neglect, Gallager’s codes have finally found practical application. They are used in the transmission of satellite TV and wireless data, and chips dedicated to decoding them can be found in commercial cell phones.
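Gallager's codes, known today as low-density parity-check codes, are built on parity checks: equations requiring that certain subsets of the transmitted bits sum to an even number. A toy sketch of the principle (using the small classic (7,4) Hamming code, not Gallager's actual construction) shows how the pattern of violated checks–the "syndrome"–points straight at a flipped bit.

```python
# Parity-check decoding in miniature, using the (7,4) Hamming code.
# Each row of H is one check: the indicated bits must sum to 0 mod 2.
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(word):
    """Evaluate every parity check; all zeros means a valid codeword."""
    return [sum(h * b for h, b in zip(row, word)) % 2 for row in H]

def correct(word):
    """Read the violated checks as a binary number giving the
    position of a single flipped bit, and flip it back."""
    s = syndrome(word)
    pos = s[0] + 2 * s[1] + 4 * s[2]
    if pos:
        word = word[:]
        word[pos - 1] ^= 1
    return word

codeword = [1, 0, 1, 1, 0, 1, 0]   # satisfies all three checks
noisy = codeword[:]
noisy[4] ^= 1                      # noise flips bit 5
assert correct(noisy) == codeword  # the syndrome locates and fixes it
```

Gallager's insight was to use many such checks, each touching only a few bits, and to pass probabilistic guesses back and forth between them–the same iterative trick that turbo codes rediscovered three decades later.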

The Birth of Information Theory

Gallager came to MIT in 1956–the same year Shannon himself returned as a professor, after 15 years at Bell Labs. But it wasn’t the prospect of working with Shannon that led him to choose MIT over Yale, where he had also applied to graduate school. “I was in the army–on a meaningless assignment–and I really hated what I was doing,” says Gallager, who taught at MIT for more than 40 years after earning his doctorate and still advises graduate students as a professor emeritus in the Research Lab of Electronics. “MIT started one week earlier than Yale did. And I was so anxious to get out of the army that that was really my only reason for coming to MIT.”



Credit: Courtesy of the MIT Museum
