
A giant of modern technology passed away recently, after a battle with cancer and other ailments. His efforts transformed the computing industry, and the fruits of his life’s work animate the products that surround us. This may sound like late news to you, but I’m not writing about Steve Jobs. Rather, the man in question is Dennis Ritchie, who died last week at age 70, and without whom not only Apple products but countless other technologies might look very different.

Dennis M. Ritchie, who was often known by the handle dmr, made two transformative contributions to technology.

First, he created the C programming language (with some help, particularly from Ken Thompson, his colleague at Bell Labs, which Ritchie joined in 1967). The C language lives on today, as do its successors C++ and Java. It wasn’t the first programming language, of course, but it was an especially important one, embodying an optimal level of abstraction—intuitive enough to easily grasp, while technical enough to get the job done. “It lets you get close to the machine, without getting tied up in the machine,” Brian Kernighan, a Princeton computer science professor, told the New York Times for its obituary. C was an all-purpose language, a kind of lingua franca for the technological age, intended to spur collaboration.
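A small sketch of what that balance looks like in practice (my illustration, not drawn from Ritchie's own writings): the same short C program can read almost like structured prose while also manipulating raw memory addresses directly.

    #include <stdio.h>

    int main(void)
    {
        char greeting[] = "hello, world";   /* the classic K&R example string */
        char *p = greeting;                 /* a pointer: a raw memory address */

        /* high-level: print the whole string in one library call */
        printf("%s\n", greeting);

        /* low-level: walk the same bytes in memory, one character at a time */
        while (*p != '\0')
            putchar(*p++);
        putchar('\n');

        return 0;
    }

The first half leans on the standard library; the second half does the same work by stepping a pointer through memory, the kind of direct access Kernighan is describing.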

The structure of C—its eminent usability—was almost a political statement; Ritchie once called it “a system around which fellowship can form.” Reading about the development of C—and Ritchie wrote a fairly exhaustive account here—you begin to feel that its invention was a crucial branching moment, one in which a few visionaries glimpsed capabilities overlooked by their peers.

Ritchie’s second great achievement was to co-develop the Unix operating system. Unix would eventually spawn Linux, on which many of the world’s data centers run. Many of the world’s popular operating systems, too, particularly in mobile—Android and iOS, for instance—in one way or another descend from Unix. “[P]retty much the entire Internet runs on” the Unix kernel, Rob Pike of Google told Wired.

Shortly after the news of his death was announced (the first mention online appears to have come from Pike's Google+ profile; he was a friend and former colleague), encomiums began accruing on YouTube and elsewhere.

A recurring theme in Ritchie’s writings is his humility. “I was not smart enough to be a physicist,” he wrote self-deprecatingly in this brief biography. And while conceding that C was an “enormous success,” he also called it “quirky” and “flawed.” And though he won the National Medal of Technology in 1998, his account of the experience is devoid of braggadocio, instead offering mischievous tidbits about how a member of his party swiped branded paper towels from the White House restroom.

What was said of Jobs is just as true, if not more so, of Ritchie: that he lives on, in some sense, in the transformed technological world he left us.
