MIT News magazine

Theory and Practice

Sitting in on a Liskov lecture
December 21, 2009

Professor John Guttag, wearing a Yankees cap, introduced Liskov to a crowd of more than 150 people. The cap, he claimed, had a purpose other than pouring salt in the wounds of disappointed Boston Red Sox fans. His PowerPoint presentation was titled “Barbara Liskov: the Derek Jeter of Computer Science”–a reference to the Yankees’ star shortstop that drew hisses from the audience but underlined the talent and accomplishment of his subject. Guttag’s presentation also featured photos of Liskov dressed in Renaissance clothing and putting a tray of cookies in the oven–“moonlighting as a baker to put her son through Harvard,” Guttag said, “which was all the more poignant, since he could have gone to MIT for free.”

Guttag turned serious long enough to describe Liskov as one of his own most valued mentors, and then she took the podium. She opened with an anecdote of her own: After the news came out that she’d won the Turing Award, she said, her husband spent a lot of time on the computer Googling the reaction, and at some point, “he came upon a quote from someone who said, ‘What did she get this award for? Everyone knows this anyway.’”

In the 1970s, however, it was emphatically not the case that “everyone knew this,” and Liskov described in great detail the intellectual environment of the time. The talk was not for the uninitiated. She began by describing several papers from the early 1970s from which she had drawn inspiration–papers with titles like “Go To Statement Considered Harmful” and “Information Distribution Aspects of Design Methodology.”

Liskov explained that in the fall of 1972, after reviewing the literature in the field, she came up with the idea for what she called abstract data types. Traditionally, a computer program would be a long list of exhaustively detailed instructions, and anyone reading the code–including the original programmer–could easily get lost. Abstract data types are, effectively, repositories for the computational details of the program, which let the programmer concentrate on the big picture. A complicated program turns into some rather simple interactions between the abstract data types. And indeed, the programmer can later change the details of the data types’ implementation–how they do their low-level computations–without changing the overall structure of the program.
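Her own examples were written in CLU, but the idea carries over directly to modern languages. The sketch below, in Python, uses a hypothetical IntSet type to illustrate the point: callers use only the operations the type exposes, so its internal representation can later be swapped out without touching the rest of the program. It is a minimal illustration of the concept, not a reproduction of her work.

```python
# A minimal sketch of an abstract data type in Python. The class and method
# names here (IntSet, add, remove, contains) are hypothetical, chosen only to
# illustrate the idea Liskov described; her examples were in CLU.

class IntSet:
    """Callers see only add / remove / contains, never the representation."""

    def __init__(self):
        # The representation is hidden behind the interface. It could be a
        # list, a dict, or a sorted array; calling code never depends on it.
        self._elements = []

    def add(self, x: int) -> None:
        if x not in self._elements:
            self._elements.append(x)

    def remove(self, x: int) -> None:
        if x in self._elements:
            self._elements.remove(x)

    def contains(self, x: int) -> bool:
        return x in self._elements


# A program written against this interface keeps working even if the
# internal list is later replaced by a hash set or a balanced tree.
s = IntSet()
s.add(3)
s.add(3)            # duplicate adds have no effect
print(s.contains(3))  # True
print(s.contains(7))  # False
```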

Liskov recalled how she and some collaborators created the programming language CLU in order to field-test the concept of abstract data types. The rest of her talk was largely a demonstration that CLU prefigured most of the ideas now commonplace in programming languages–ideas such as polymorphism, type hierarchy, and exception handling.

During the question-and-answer session that followed, Liskov was asked the secret of her success. Part of her answer–which must have chagrined some members of the audience–was that she doesn’t work long days. “I always went home at night, and didn’t work in the evening,” she said. “I always found that downtime to be really useful.” She also emphasized the importance of pursuing research that excites you–rather than, say, the research that will generate the most publications. That way, she said, “at the end, if you fail, at least you did something interesting, rather than doing something boring and also failing.” After the laughter died down, she added, “Or doing something boring and then forgetting how to do something interesting.”
