I once wrote on this page, “Science fiction is to technology as romance novels are to marriage: a form of propaganda” (see “Against Transcendence,” February 2005).
This represents my sincere view, but stated so baldly, without elaboration, the remark implies a contempt I do not feel. For I adore science fiction. If it is propaganda, I am its happy dupe; and if I am a technology editor and journalist today, it is because between the ages of seven and fourteen, I read little but science fiction.
I grew up on a farm on the North Coast of California that had at one time been a kind of hippie commune. Around the various cabins on the property were dozens of yellowed paperbacks of the sort that the counterculture loved; and when I recall my childhood all at once, it is perpetually summer, and I am alone in a field or a tree house, reading Alfred Bester, Algis Budrys, Samuel R. Delany, Philip K. Dick, or Robert Heinlein.
I grew out of science fiction, which is to say that I learned to enjoy other, more literary writing and to disguise my passionate fandom. But science fiction continues to influence me. To this day, my tastes and choices as an editor and journalist are bluntly science fictional: I look for technologies that are in themselves ingenious and that have the potential to change our established ways of doing things. Best of all, I like technologies that expand our sense of what it might mean to be human.
In this, I believe, I am an entirely conventional technologist. Most of us came to technology through science fiction; our imaginations remain secretly moved by science-fictional ideas. Only the very exalted are honest about their debt. In his collection of lectures on the future of technology, Imagined Worlds, the great theoretical physicist Freeman Dyson writes, “Science is my territory, but science fiction is the landscape of my dreams.”
Science fiction’s influence on technologists’ imaginations can be observed in its successful and unsuccessful predictions. Discerning a causal relationship between what science fiction has predicted and what technologists have created might be an instance of the logical fallacy post hoc ergo propter hoc (“after this, therefore because of this”), except for a curious fact: SF writers not only describe current research and extrapolate its likely development but also prescribe cool things that enthralled technologists later make or try to make. In short, life imitates art.
Fans decry any emphasis on their favored genre’s predictive power (science fiction, they say, is really about the present day); but nonetheless, the accurate predictions of many science fiction writers are justly famous. Geostationary telecommunications satellites were first proposed by Arthur C. Clarke in a paper titled “Extra-Terrestrial Relays: Can Rocket Stations Give World-Wide Radio Coverage?” published in Wireless World in October 1945. Space travel has been a staple of science fiction since Jules Verne published De la Terre à la Lune in 1865. Robots first appeared in Karel Čapek’s play R.U.R. in 1921. Indeed, it is more useful to ask, What hasn’t SF predicted?
But the prescriptive power of science fiction has functioned both positively and negatively. Older computer scientists and electrical engineers such as Marvin Minsky and Seymour Cray, born in the mid-1920s, pursued a vision of humanlike artificial intelligence and mainframe computing popularized by science fiction after World War II (see Isaac Asimov’s “Multivac” stories). These scientists remained committed to the glamour of big computing long after research suggested that it would not soon produce the thinking machine for which they pined. Here, science fiction’s predictions were wrong, but still influential.
By contrast, consider the influence of science fiction on the development of the personal computer and the Internet. It is often said that SF missed both, but that isn’t really true. The “cyberpunks” and their precursors began dreaming of the Net in the late 1970s. Algis Budrys’s highly literate 1977 novel, Michaelmas, describes a worldwide web of telecommunications and computer data. Vernor Vinge, in 1981’s True Names, anticipated a cyberspace that is recognizably our own. Most notably, William Gibson invented the “consensual hallucination” of the Matrix in Neuromancer, published in 1984. These fictions greatly influenced younger technologists such as Tim Berners-Lee and Jaron Lanier. The Web would not be the demotic, freewheeling society it is without the cyberpunks.
One can go further. In his survey of science fiction, The Dreams Our Stuff Is Made Of: How Science Fiction Conquered the World, Thomas M. Disch writes, “It is my contention that some of the most remarkable features of the present historical moment have their roots in a way of thinking that we have learned from science fiction.” I think he’s right, and so we’re publishing some science fiction of our own: a story by David Marusek, author of the acclaimed 2005 novel Counting Heads (see “Osama Phone Home”).
Write and tell me what you think at email@example.com.