
Computing’s Johnny Appleseed

Almost forgotten today, J.C.R. Licklider mentored the generation that created computing as we know it.

Often, says Tim Anderson, thinking back to the mid-1970s and his time as a student at MIT’s Laboratory for Computer Science, you’d walk into the terminal room and there he’d be: Professor J.C.R. Licklider, typing code with his own 10 fingers.

This took some getting used to. Lick, as everyone called him, wasn’t a hacker but an absent-minded professor type in his 60s. “He’d sit there with a bottle of Coke and a vending machine brownie as if that were a perfectly satisfactory lunch,” recalls Anderson, who is now the chief technology officer at an Internet startup known as Offroad Capital. “He had these funny colored glasses with yellow lenses; he had some theory that they helped him see better.”

Anderson wasn’t sure what Lick was working on: something to do with making computer code as intuitive as ordinary conversation and as easy as drawing a sketch. The programs he wrote weren’t so hot, but that almost didn’t matter. For Lick the important thing was imagining the future, and an astonishing amount of what we now take for granted owes its origins to his work. He would hold forth for hours in his wry Missouri accent, spinning visions of graphical computing, digital libraries, online banking and e-commerce, computers with megabytes of memory, software that would live on the network and migrate wherever it was needed. All this came 10 years before the Macintosh, 20 years before the popularization of the Web.

What Lick never got around to mentioning was that he had done as much as anyone on earth to make such wonders possible. In fact, the big, rumpled guy in the corner office had laid the foundations for time-sharing, point-and-click interfaces, graphics and the Internet: virtually all of modern computing. “He was clearly the father of us all,” says Anderson. “But you’d never know it from talking to him.”

Mind Meets Machine

Such modesty was bred into Licklider at an early age. Back in St. Louis, where he was born in 1915, a self-satisfied man was said to have too much “side,” a reference to the fatty flanks of a hog. And little Robnett, as Joseph Carl Robnett Licklider was known as a boy, had been raised to think “side” was unseemly. Every evening from the time he was 5, it had been his duty and honor to take the arm of his maiden aunt, escort her to the dinner table, and hold out her chair. Even as an adult, Lick was a remarkably courteous man who rarely raised his voice in anger and who found it almost physically impossible to remain seated when a woman entered the room.

A happy, energetic boy with a lively sense of fun, Licklider early on displayed an insatiable curiosity and a love of all things technological, especially cars. At 15, he bought an old junker and took it apart again and again, trying to figure out its inner workings. For years thereafter, he refused to pay more than $50 for a car; whatever shape it was in, he could fix it up and make it go.

At Washington University in St. Louis, he wanted to major in everything, and almost did. He graduated in 1937 with a triple degree in physics, math and psychology, with particular interest in deciphering the ultimate gadget: the brain. For his doctoral dissertation at the University of Rochester, he made the first maps of neural activity on the auditory cortex, pinpointing the regions crucial to our ability to hear musical pitch.

Ironically, this passion for psychology would be central to Lick’s pathbreaking work in computing. Most computer pioneers came to the field in the 1940s and 1950s with backgrounds in math or electrical engineering, leading them to focus on gadgetry: making the machines bigger, faster and more reliable. But Lick’s study of neuroscience left him with a deep scientific appreciation for the human capacity to perceive, to adapt, to make choices, and to create new ways of tackling problems. To Lick, these abilities were every bit as subtle and as worthy of respect as the automated execution of a series of instructions. And that’s why to him, the real challenge would always lie in adapting computers to the humans who use them, exploiting the strengths of each.

Lick’s instincts in this direction were apparent by 1942, when he joined Harvard’s Psycho-Acoustics Laboratory. The Army Air Force was funding a team of psychologists at that lab to attack the problem of noise. The United States had just entered World War II, and aircraft crews were finding it difficult to function amid the overwhelming din of the engines. Lick devised a method for artfully distorting radio transmissions to emphasize consonants over vowels and thus make words stand out against a background of radio static and mechanized cacophony. Already, he was shaping the technology to fit the human, not the reverse.

That sensibility asserted itself even more after 1950, when Lick moved to MIT. Almost immediately, he got caught up in Project SAGE, a crash program to create a computer-based air-defense system against Soviet long-range bombers. The computer in SAGE was Whirlwind, which had been under development at MIT since 1944. Other early computers, such as ENIAC, had started out as giant calculators, with an operating style to match: You entered the numbers and eventually got back a printout with the answer. This came to be known as batch processing. Whirlwind, by contrast, had started out as a flight simulator and had evolved into the world’s first real-time computer: It would try to respond instantly to whatever the user did at the console. The challenge was to prove that a computer could take the data coming in from a new generation of air-defense radars and display the results rapidly in a meaningful form.

The project succeeded. Although high-flying, fast-moving ICBMs had made the air-defense system obsolete by the time it was finally deployed in 1958, SAGE nevertheless served as a model for the interactive, real-time computers that followed, including modern personal computers. Lick headed SAGE’s human-factors team, and he saw the project as an example of how machines and humans could work in partnership. Without computers, humans couldn’t begin to integrate all that radar information. Without humans, computers couldn’t recognize the significance of that information, or make decisions. But together... ah yes, together.

By 1957, the year he left MIT for the nearby consulting firm Bolt Beranek and Newman, that train of thought was leading Lick down strange new paths. That spring and summer, he kept track of what he actually did during the day, with shocking results. “About 85 percent of my ‘thinking’ time was spent getting into a position to think, to make a decision, to learn something I needed to know,” he later wrote. He concluded that his decisions on what work to attempt “were determined to an embarrassingly great extent by considerations of clerical feasibility, not intellectual capability.”

Computers, he believed, would rescue the human mind from its enslavement by mundane detail. Human and machine were destined to unite in an almost mystical partnership, with computers handling rote algorithms while people provided the creative impulses. The hope, he said, was that “the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today.” Lick found this vision of human-computer symbiosis so compelling that standard psychology could no longer compete. “Any psychologist is crazy to keep on working with people if he has access to a computer,” he said, only partly in jest.

And so he switched fields. In a 1960 paper called “Man-Computer Symbiosis,” published in the IRE Transactions on Human Factors in Electronics, Licklider formulated a new vision of computing. He described a machine that humans could relate to in the manner of “a colleague whose competence supplements your own,” a friend who could help when the problems got too hard to think through in advance. Such problems “would be easier to solve,” he wrote, “and they could be solved faster, through an intuitively guided trial-and-error procedure in which the computer cooperated, turning up flaws in the reasoning or revealing unexpected turns in the solution.”

Much easier said than done. Real-time computers were still a rarity in 1960, and far too expensive for individual use. Therefore, Lick concluded, the most efficient way to use this technology was to have the computer “divide its time among many users.” This was not an original idea; such “time-sharing systems” were already under development at MIT and elsewhere. But Lick, never one to hold his imagination in check, followed that notion to its logical conclusion: He described an online “thinking center” that would “incorporate the functions of present-day libraries.” He foresaw “a network of such centers, connected to one another by wide-band communications lines and to individual users by leased-wire services.” Any similarity to today’s Internet is not a coincidence.

Networks would allow computers to communicate with one another. But Lick also saw a desperate need for better ways for humans to interact with computers. Punch cards and printouts were, he wrote, hopelessly impoverished relative to human communication via sight, sound, touch and even body language. His proposed solution: a desk-sized console that would function much like today’s personal computer, equipped with voice and handwriting recognition. He described a display surface “approaching the flexibility and convenience of the pencil and doodle pad or the chalk and blackboard.”

Lick pointed out the need for reference works distributed via cheap, mass-produced “published memory” (think CD-ROM); data storage that could access items by content, and not just by names or keywords (still difficult); and languages that would allow you to instruct the computer by giving it goals, instead of step-by-step procedures (even more difficult). He also revealed his mixed feelings about artificial intelligence, then in its infancy. He saw it as potentially very useful, but he knew far too much about the brain and its complexities to believe that computers would soon be surpassing humans.

Although Licklider’s ideas were little more than visions in the late 1950s, technology was beginning to catch up. In the spring of 1960, a struggling young company called Digital Equipment Corp. introduced its first computer, the PDP-1. It was a real-time, interactive machine, and it came with a built-in display screen. It was the perfect machine for Lick to try to implement the research agenda laid out in “Symbiosis.” He and his team bought the display model off the exhibit floor for $120,000 (enough to make the BBN higher-ups blanch) and plunged in. They programmed their PDP-1 for some of the first experiments with educational software, including a language vocabulary drill written by Lick himself. They experimented with online search and data retrieval. They even worked on time-sharing, although the PDP-1, whose horsepower was roughly that of the original Radio Shack TRS-80, didn’t have much to share.

Building the ARPA Community

Lick would have happily continued this way indefinitely, had he not received a call in 1962 from the Department of Defense’s Advanced Research Projects Agency (ARPA). The Pentagon had formed ARPA five years earlier in the aftermath of Sputnik as a fast-response research agency, charged with making sure the United States was never again caught flat-footed. Now, ARPA wanted to set up a small research program in “command and control”: the ancient art of making timely decisions and getting those decisions implemented by your forces in the field. This was a critical matter in the nuclear age, and was obviously going to involve computers. And once ARPA director Jack Ruina heard Lick expound upon his vision of interactive, symbiotic computing, he knew he had found the right person to lead the effort.

Lick didn’t really want to leave BBN. But how could he say no? He would have $10 million a year to give away pretty much as he saw fit: no peer review, no second-guessing from higher-ups. The ARPA style was to hire good people, then trust them to do their jobs. There would be no “cloak and dagger” stuff, as Lick called it; the research he funded would be completely unclassified. So long as he was advancing command and control, broadly defined, he could choose which projects to fund. In effect, Lick was being offered an opportunity to spend big money in pursuit of his vision of human-computer symbiosis.

He hit the ground running in October 1962. His strategy was to seek out the scattered groups of researchers around the country who already shared his dream, and nurture their work with ARPA funding. Within months, the “ARPA community,” as it came to be known, was taking shape. First among equals was Project MAC at MIT, founded with Lick’s encouragement as a large-scale experiment in time-sharing and as a prototype for the computer utility of the future. MAC (the name stood for both “Multi-Access Computer” and “Machine-Aided Cognition”) would also incorporate Marvin Minsky’s Artificial Intelligence (AI) Laboratory. Other major sites included Stanford, where Lick was funding a new AI group under time-sharing inventor John McCarthy; Berkeley, where he had commissioned another demonstration of time-sharing; Rand Corp., where he was supporting development of a “tablet” for free-hand communication with a computer; and Carnegie Tech (now Carnegie Mellon University), where he was funding Allen Newell, Herbert Simon and Alan Perlis to create a “center of excellence” for computer science. Lick had also taken a chance on a soft-spoken visionary he barely knew: Douglas Engelbart of SRI International, whose ideas on augmenting the human intellect with computers closely resembled his own and who had been thoroughly ignored by his colleagues. With funding from Lick, and eventually from NASA as well, Engelbart would go on to develop the mouse, hypertext, on-screen windows and many other features of modern software.

The trick, Lick knew, was to create a community in which widely dispersed researchers could build on one another’s work instead of generating incompatible machines, languages and software. Lick broached this issue in an April 1963 memo to “Members and Affiliates of the Intergalactic Computer Network,” meaning his principal investigators. The solution was to make it extremely easy for people to work together by linking all of ARPA’s time-sharing computers into a national system. He wrote:

If such a network as I envisage nebulously could be brought into operation, we would have at least four large computers, perhaps six or eight small computers, and a great assortment of disc files and magnetic tape units-not to mention the remote consoles and teletype stations-all churning away.

From the modern perspective, this little paragraph is electrifying: it is perhaps the first written description of what we now call the Internet. But Lick didn’t stop there. Clearly enamored of the idea, he spent most of the rest of the memo sketching out how people might use such a system. He described a network in which software could float free of individual machines. Programs and data would live not on an individual computer but on the Net, the essential notion of the Java applets now found all over the Web.

Lick couldn’t do much about his idea immediately, since networking technology wasn’t even close to being ready. So instead he talked (and talked, and talked), trying to sell the notion to anyone who would listen, confident that he was planting a seed that would grow.

Meanwhile, he had a program to run. Lick presided over his far-flung community in much the same way he’d run his research groups at MIT and BBN, with a mix of parental concern, irrepressible enthusiasm and visionary fervor. True, his nonstop stream of ideas and suggestions could be exasperating; the recipients sometimes felt as though their sponsor’s imagination was voyaging among the stars while they were still struggling to build a biplane. But Lick was more interested in being a mentor than a micromanager: As long as people made reasonable progress in the right direction, he would let them find their own way.

At ARPA, program managers traditionally moved on after a year or two to give someone else a chance, and Lick was no exception. But in September 1964, when he left ARPA for the IBM research laboratory, he took care to find a successor who shared his vision. His choice was Ivan Sutherland, a 26-year-old computer graphics genius from MIT’s Lincoln Lab whose doctoral project, Sketchpad, was the ancestor of today’s computer-aided design software.

Lick’s influence would continue to be felt at ARPA for more than a decade. Sutherland’s successor in 1966 would be Robert W. Taylor, who shared with Lick a background in psychology and who was probably Lick’s most enthusiastic convert to the symbiosis vision. It was Taylor who would inaugurate the actual development of Lick’s proposed computer network, which began operation in 1969 as the ARPAnet and ultimately evolved into the Internet. And it was Taylor who went on to head the computer group at Xerox’s Palo Alto Research Center (PARC), where, during the 1970s, researchers turned Lick’s notion of symbiosis into a working system. PARC’s radical developments included the first graphics-based personal computer, the Ethernet local-area network and the laser printer. When Taylor left ARPA in 1969, he handed the reins to ARPAnet architect Larry Roberts, another computer graphics maven who had become intrigued with networking after a late-night bull session with Lick.

Lick always insisted, with characteristic modesty, that he had accomplished very little in his two years at ARPA. In a narrow sense, he had a point. Essentially nothing was happening in September 1964 that had not already been underway in one form or another when he arrived at the agency.

And yet, Licklider’s impact was profound. When ARPA presented him with a never-to-be-repeated opportunity to turn his vision into reality, he had the guts to go for it. Once he had the Pentagon’s money in hand, Lick had the taste and judgment to recognize good ideas and good people. He had the competence and integrity required to win their respect. And he had the overarching concept, human-computer symbiosis, that let each of his disciples feel like a part of something much larger than themselves. Most important, by funneling so much money into research at universities, where most of it actually went to support students, he guaranteed that his vision would live on after him.

“It seems to me that Licklider and ARPA were mainly about winning the hearts and minds of a generation of young scientists, and convincing them that computer science was an exciting thing to do,” says James Morris, chair of the Carnegie Mellon computer science department. “In the aftermath of Sputnik, the glamour field was physics, not computing. Lots of very smart people made a career decision to go into a field that didn’t exist yet, simply because ARPA was pouring money into it.”

Forgotten Revolutionary

As eloquent testimony to the success of Lick’s strategy, consider that during the late 1960s and early 1970s, at the height of the Vietnam debacle, when many people viewed governments and institutions of all kinds as instruments of oppression and punch-card-belching mainframes as a potent symbol of tyranny, a rising generation of students was beginning to think of computers as liberating. This was the generation that would gather at Xerox PARC. And this was the generation, together with the students they taught, that would engineer the personal computer revolution of the 1980s, turn the ARPAnet into the Internet and then create the World Wide Web. The list is a long one, including Alan Kay of the University of Utah, who in 1968 came up with the notion of a notebook computer called the “Dynabook”; Dan Bricklin of Project MAC, who invented VisiCalc, the first electronic spreadsheet; Bob Metcalfe of Project MAC, inventor of Ethernet and founder of 3Com; John Warnock of Utah and PARC, founder of Adobe Systems; and Bill Joy of Berkeley, co-founder of Sun Microsystems. Even now, people who never heard of J.C.R. Licklider fervently believe in what he dreamed of, because his ideas are in the very air they breathe.

Why, then, have most people never heard of him?

One reason is that Lick wasn’t the kind of person modern-day computer journalists like to write about. He didn’t start a company, or create best-selling software. He wasn’t a mediagenic guru. He seemed to be just another government bureaucrat from back in technology prehistory. Moreover, Lick wasn’t even very successful as a bureaucrat, at least not after he left ARPA. Two exasperating years at IBM sent him back to MIT in 1966; the computer giant’s corporate culture was grounded so firmly in mainframes and batch-processing that Lick saw no chance to convert the company to human-machine symbiosis in his lifetime. His rocky stint as director of Project MAC, from 1968 to 1971, strained many an old friendship there; Lick’s loathing for paperwork made him a disastrous manager. A second tour at ARPA, from 1974 to 1975, was even worse: In the post-Vietnam environment, the free-wheeling computer research program he had founded was mired in demands for immediate military relevance. A colleague who watched him there likened it to a Christian being fed to the lions.

And Lick wasn’t a young Christian anymore. By the time microcomputers hit big in the early 1980s, he was pushing 70. Just as his ideas of personal computing and networking were coming to fruition, he was losing the vigor to contribute significantly to the cause. His hands had a noticeable tremor, a condition that would eventually be diagnosed as Parkinson’s disease. His allergies had crossed the line into asthma, and he never went anywhere without an inhaler. In the end, it was the asthma that finally caught up with him: An attack left his brain without oxygen too long, and Lick died without regaining consciousness in June of 1990.

But mainly, we haven’t heard of Lick because he refused to toot his own horn. He seems to have been one of those rare beings who genuinely didn’t care who got the credit, so long as the goal was accomplished. Psychologist George Miller, who worked with Licklider at Harvard during World War II, remembers him as “extremely intelligent, intensely creative, and hopelessly generous” with his ideas.

Forty years later, Stuart Malone discovered much the same quality. In the early 1980s, Lick had taken Malone and a number of other undergraduates under his wing. He made sure they had a space of their own, a common area they painted green and called “The Meadow.” He got them exclusive use of one of the lab’s VAX/750 computers, which they immediately equipped with a Unix password: lixkids. He had made them feel part of something much larger than themselves. And, of course, he had said not a word about his own past, which was why Malone was so astonished at Lick’s retirement dinner in 1985. “There were hundreds of people there from MIT, from DEC, from PARC, from the Defense Department,” he recalls, “all standing up and crediting Lick with giving them a chance to do their best work.”

David Burmaster, who had been Lick’s assistant at Project MAC, will never forget it. “I’d felt I was the only one, that somehow Lick and I had this mystical bond. Yet during that evening I saw that Lick had had this amazing relationship with... a hundred? Two hundred? I’m not sure even how to count them. And everybody he touched felt that he was their hero, and that he had been an extraordinarily important person in their life.”
