
The dangerous appeal of technology-driven futures

Technology doesn’t rule us. We direct it, but often by inaction.


Love it or hate it, technology enthralls us with the promise of change. Sometimes it’s the presumed benefits that grab our attention: curing disease, replacing fossil fuels, increasing food supplies, unlocking the secrets of the deep sea, colonizing Mars, or ending the ravages of old age. Other times the risks loom larger. What if we unleash a killer virus, set in motion a nuclear doomsday, block out harmful solar radiation with chemicals that prove toxic, or build computers that decide humans are dispensable? 

The battle between light and dark in the way we imagine technological change is ancient. In Greek mythology, Prometheus suffered agonies for bringing fire to Earth, and Daedalus lost his son to the urge to fly to freedom. But the most optimistic and most pessimistic views of technology both rely on a common misconception: that a technological pathway, once embarked upon, leads to inevitable social consequences, whether utopian or dystopian. 

This view, known as technological determinism, is historically flawed, politically dangerous, and ethically questionable. To achieve progress, societies like ours need a more dynamic understanding of why technology changes, how we change with it, and how we might govern our powerful, marvelous machines.

Technology is not an autonomous force independent of society, nor are the directions of technological change fixed by nature. Technology at its most basic is toolmaking. Insisting that technological advances are inevitable keeps us from acknowledging the disparities of wealth and power that drive innovation for good or ill. 

Technology is always a collective venture. It is what it is because many people imagined it, labored for it, took risks with it, standardized and regulated it, vanquished competitors, and made markets to advance their visions. If we treat technology as self-directed, we overlook all these interlocking contributions, and we risk distributing the rewards of invention unfairly. Today, an executive officer of a successful biotech company can sell stock worth millions of dollars, while those who clean the lab or volunteer for clinical trials gain very little. Ignoring the unequal social arrangements that produced inventions tends to reproduce those same inequalities in the distribution of benefits.

Throughout human history, the desire for economic gain has underwritten the search for new tools and instruments—in fields like mining, fishing, agriculture, and, more recently, gene prospecting. These tools open up new markets and new ways to extract resources, but what the innovator sees as progress often brings unwanted change to communities colonized by imported technologies and their makers’ ambitions.

For example, in West Bengal, where I was born, weavers lost such skills as making the intricate narrative motifs of the Baluchari sari during 200 years of British rule. Indeed, Britain’s first industrial revolution, which introduced the power loom in cities like Lancaster but adopted punitive tariffs to keep out hand-loomed cloth from India, was also a story about dismantling Bengal’s once-flourishing textile industry. Lost arts had to be regained after the British left. The cost of a radical break with a nation’s own economic and cultural heritage is incalculable.

The desire for military advantage is another driver of technological change that can, in some instances, benefit civil society—but “dual-use” technologies often retain ties to the forces that prompted their development. Nuclear energy, a spinoff from the pursuit of the atomic bomb, was sold to the world by US President Dwight Eisenhower as “atoms for peace.” Yet nuclear power remains closely tied to the threat of nuclear weapons proliferation.

Similarly, the internet and World Wide Web, which revolutionized how much of the world lives today, owe much to the US Defense Department’s vision of a network of computers. First celebrated as a space for emancipation, the digital world has slowly revealed its antidemocratic features: constant surveillance, cybersecurity threats, the lawlessness of the dark web, and the spread of misinformation. More public awareness of the internet’s origins might have led to a more accountable cyberworld than the one designed by hotshot technologists.

The story of the internet shows that modern societies are often better at imagining the upsides of technology than its downsides. But the trajectory of innovation is also guided by more subtle cultural preferences, often with profound consequences. 

In US biomedicine, for example, energy, attention, and money tend to be directed to high-impact, silver-bullet solutions, or “moonshots,” rather than to messier changes in the social infrastructures that give rise to many health problems. 

This inclination is reflected in Congress’s decision to authorize $10 billion for Operation Warp Speed to bring a covid-19 vaccine quickly to market. Moderna owes much of its success as a vaccine manufacturer to that massive public spending, and both Moderna and Pfizer have benefited hugely from lucrative supply contracts with the US government. 

At the same time, about a third of all US deaths from the pandemic occurred in nursing homes, a result of decades of underinvestment in the unglamorous social practices of elder care. Collectively, we chose to ignore the plight of the vulnerable elderly, and spent big on technology only when everyone was at risk.

Change may not be inevitable, but economists have a point when they talk about “path dependency,” or the notion that once an engine gets going it’s bound to follow an existing track. Sunk costs—foundations laid, machinery ordered, workforces trained—cannot be recovered. It often seems easier to go where the flows of materials and social practices have already cut deep channels. It’s not surprising, then, that defense spending has proved to be one of the prime motivators of innovation, even though such investments perpetuate power imbalances and seldom respect cultural or ethical sensitivities. 

In his famous poem “The Road Not Taken,” Robert Frost reflects on how the human mind constructs narratives of inevitability. We come to a fork in the road, we choose a path, and then as memory plays its tricks we come to see that choice as shaping all that came after. Faced with mounting problems of inequality, diminishing resources, and a looming climate calamity, we must learn to recognize the flaws in such linear storytelling, and to imagine the future along as-yet-untraveled pathways of change.

Sheila Jasanoff is professor of science and technology studies at the Harvard Kennedy School.
