The General of Physics
In 1931, after three years of effort, a team of MIT electrical engineers completed an elaborate room-sized machine that used electric motors to move gears, rods, and 18 shafts. The “differential analyzer” could automatically solve differential equations, a fundamental application of calculus. A practical boon for engineers, it represented “the first of the great family of … computers,” as its inventor, Vannevar Bush, EGD ‘16, described it.
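The analyzer solved such equations by mechanical integration: each wheel-and-disc unit accumulated one integral, and the shafts fed the outputs back as inputs. A minimal software sketch of the same idea, assuming a simple harmonic oscillator y'' = −y as the example equation (an illustrative fixed-step integration loop, not a model of Bush's actual machine):

```python
def solve_harmonic(steps=10000, dt=0.001):
    """Integrate y'' = -y from y(0)=1, y'(0)=0, mimicking the analyzer's
    chained integrators: one accumulator for velocity, one for position."""
    y, v = 1.0, 0.0
    for _ in range(steps):
        a = -y        # the equation itself: acceleration is -position
        v += a * dt   # first "integrator" accumulates velocity
        y += v * dt   # second "integrator" accumulates position
    return y

# With steps * dt = 10, the result tracks the exact solution cos(10).
```

The two accumulator updates play the role the machine's integrator discs played; chaining them is what lets a second-order equation emerge from two first-order steps.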
This widely hailed machine might have been the achievement of a lifetime. But Bush’s career continued its upward trajectory: the inventor, engineer, and professor would become dean of MIT’s School of Engineering, director of U.S. scientific research during World War II, and architect of the country’s scientific establishment. As an engineer, he inspired the inventors who pioneered personal computing and the Internet; as an administrator, he gave America’s scientific research the wartime push it needed. Indeed, in January 1942, as Japan launched a new series of attacks in the Pacific and German submarines started a new wave of strikes in the Atlantic, Collier’s magazine called Bush “the man who may win or lose the war.”
The Collier’s story would prove prophetic. As head of President Franklin Roosevelt’s newly created Office of Scientific Research and Development (OSRD), Bush oversaw the development of radar and accelerated the Manhattan Project to produce the first atomic bombs. Some scholars have called him “the first presidential science advisor,” though this role did not formally exist then. Time magazine, in a 1944 cover story, dubbed him the “General of Physics.” By war’s end, Bush had also created a road map for America’s future with his report Science: The Endless Frontier, which advocated long-term government funding of academic research. “No American has had greater influence in the growth of science and technology than Vannevar Bush,” former MIT president Jerome B. Wiesner, HM ‘71, once wrote.
Vannevar (rhymes with “beaver”) Bush was born in 1890 in Everett, Massachusetts, the son of a pastor who supposedly named him after a colleague called John Van Nevar. He received his bachelor’s from Tufts and his doctorate in engineering, which he completed in 1916 after just one year, jointly from MIT and Harvard. He joined MIT in 1919 as an associate professor of power transmission and soon became an influential figure. Directing graduate study in electrical engineering, Bush increased the number of people earning master’s degrees in the department from four in 1921 to 45 in 1923.
Bush could intimidate colleagues and sometimes played hardball. He nursed a grudge against his thesis advisor, Arthur Kennelly, who had made it difficult to complete his doctorate in a year; when Kennelly retired, Bush was supposed to find an office for him worthy of an emeritus professor, but instead he placed Kennelly’s name outside a room of switchboard operators.
When he became dean of the School of Engineering, in 1932, his tough edge made him a foil to MIT president Karl Compton’s more courteous ways. “I’m the ‘yes’ man and Bush is the ‘no’ man, and you’ll have to see Bush too,” Compton once told a faculty member who was seeking funding. Many colleagues found Bush friendly, however. The MIT physicist Philip Morse recalled that he would often encounter Bush in his office, “leaning back in his chair, his feet on his desk, interspersing puffs of smoke from his eternal pipe with bits of dry humor or laconic wisdom, spoken in his Yankee twang.”
Bush expressed a romantic view of his work, once writing that “he who struggles with joy in his heart struggles the more keenly because of that joy.” And his struggles bore great fruit. In addition to the differential analyzer, he created a voice-activated typing machine and a wearable camera, and he helped found the American Appliance Company (soon to be renamed Raytheon Manufacturing), which made components to improve radios. As an MIT professor, Bush taught a remarkable collection of students, including Claude Shannon, SM ‘40, PhD ‘40, the pioneer of information theory, and Frederick Terman, ScD ‘24, who helped develop Silicon Valley as a technology hub.
In 1938, Bush left MIT to head the Carnegie Institution of Washington, then one of the biggest foundations supporting American science. Soon he approached Roosevelt about improving U.S. scientific research. “Bush realized what universities had to offer in a way that nobody else realized at the time, especially concerning defense,” says David A. Mindell, director of MIT’s Program in Science, Technology, and Society and author of the book Between Human and Machine.
Bush’s proposal for a new science research agency was approved after a single 15-minute meeting with Roosevelt, who placed one simple annotation on the memo Bush had prepared for the talk: “O.K.—FDR.” Yet shepherding the development of a better radar system and the atomic bomb involved years of struggles with Washington’s military bureaucracy; Bush had to battle to put his agency, known as the National Defense Research Committee before it became the OSRD, on equal terms with the army and navy in shaping war strategy. To succeed, he doggedly formed good working relationships with Roosevelt and his close aide Harry Hopkins, and with Roosevelt’s successor, Harry Truman. As Bush recounted later, Truman once remarked, “Van, you should be a politician.” Bush replied, “Mr. President, what the hell do you think I’ve been doing around this town for five or six years?”
As the war drew to a close, Bush noted in The Endless Frontier that though science provided “much of our hope for the future,” we had “no national policy for science.” His report eventually led to the creation of the National Science Foundation (NSF) under Truman in 1950.
Bush was not chosen to head the NSF, however, and it became an agency very different from what the OSRD had been. As Mindell points out, Bush ran the OSRD as a close-knit organization where a relatively small number of program managers made decisions that often involved people they already knew well. By contrast, the NSF was subject to congressional oversight and distributed grants on the basis of peer review by a large group of scientists.
“Bush was an elitist in some ways that worked well,” says Mindell. “He felt that scientists saw the landscape better than other people and should be able to parcel out resources accordingly.” That approach was effective during World War II, Mindell says, but “by 1947, his vision had to compromise with the needs of a democracy.” Bush returned to his duties at the Carnegie Institution and then served as chairman and honorary chairman of the MIT Corporation. He died in 1974.
In recent years, the growth of computing has focused attention on Bush’s work in information processing and storage. Especially prophetic was “As We May Think,” a 1945 essay in the Atlantic Monthly describing his vision for a microfilm-based device called the Memex, “in which an individual stores his books, records and communications, and which is mechanized so that it may be consulted with exceeding speed.” It would be, he wrote, “an enlarged supplement to … memory.”
Such work would have a profound influence on personal-computing pioneers like Douglas Engelbart, the inventor of the mouse and an early version of hypertext, who foresaw that networked computing would let people work together at a distance to solve otherwise intractable problems. He and others have cited the Memex as a source of inspiration for the idea that we could store, arrange, and link pieces of digital information in logical ways.
Bush’s idea of a “memory structure” created relationships “in ways that linear paper couldn’t,” Engelbart has said, recalling that he was “thrilled” after reading “As We May Think.” In the age of the Internet, we use information technology in ways Bush never imagined. But his vision paved the way for these inventions, just as his scientific leadership helped win the biggest war ever fought.