Every so often, I ask my daughter about the future. When she was three, she had only a basic concept of time, with little awareness of clocks or calendars. She could understand The Very Hungry Caterpillar, a classic children’s book about a creature gorging on food over a week, but when she would tell the story back to me, she would mix up the days. Time, for her, was disordered. By the age of five, however, she had figured out how yesterday trailed behind her and tomorrow extended in front. At breakfast one day, I asked her how far into the future she could imagine. “When I am 10,” she replied. Tomorrow existed for her, it seemed, but went dark five years ahead.
She’s now seven. Recently, I asked how frequently she thinks about the future.
“Not often,” she said. “But sometimes I worry what will happen.”
“What do you worry about?”
“Getting hurt, or getting arrested or something.”
“Can you imagine being the same age as me and Mum?”
“Can you imagine being a teenager?”
“Can you imagine having your own children?”
“That freaks me out.”
The older she gets, the more she populates the years to come in her imagination. Culture fills much of that canvas, and I’ve often no idea where she picks it up.
“The Singulation,” she explained to me recently, “is where people are miserable in the future. And a person says ‘What’s the point?’ The robots take over the Earth.”
“Wait, are you talking about the Singularity? Where did you learn that?!”
The cartoon Captain Underpants, she said.
Just as children expand their temporal perceptions as they age, so too has our species over millennia. Like toddlers, our pre-human ancestors had no sense of a distant future. They lived only in the present. Humanity’s trajectory from tool-wielding hominins to the architects of grand metropolises has been interwoven with our ever-expanding sense of time. Unlike other animals, we have minds capable of imagining a deep future, and we can conceive the daunting truth that our lifetime is a mere flash in an unfathomable chronology.
Yet while we may have this ability, it is rarely deployed in daily life. If our descendants were to diagnose the ills of 21st-century civilization, they would observe a dangerous short-termism: a collective failure to escape the present moment and look further ahead. The world is saturated in information, and standards of living have never been higher, but so often it’s a struggle to see beyond the next news cycle, political term, or business quarter.
How to explain this contradiction? Why have we come to be so stuck in the “now”?
The future isn’t what it used to be
Being able to conceptually manipulate time may be what set us apart from other animals. In the Pleistocene, our ancestors developed what evolutionary biologists call “mental time travel.” We can build theaters in our minds that allow us to play out scenes and characters from the past, as well as hypothetical stories about the future.
Yet while early humans had this talent, their concept of a deeper future was rudimentary. In Western thought, this was the case until at least the Middle Ages. For centuries, a cyclical view of time dominated, a view of seasons and kingdoms. Beyond those time frames, perhaps the only major change expected in the future came from religious teachings: the apocalypse. Until then, though, there was only an extended present. “In medieval times, most human affairs had the form of endless repetition: sowing and harvesting, disease and health, war and peace, the rise and fall of kingdoms—there was little reason to believe in long-term change or even improvement in human affairs,” wrote Lucian Hölscher, a historian at Ruhr University Bochum, in a 2018 essay. “The long-term future, at least in this world, did not exist. Rather people lived in something of an extended present.”
Even the medieval builders of cathedrals—often lauded as examples of long-term thinking for creating structures that would last generations—were not imagining radically different futures with any great degree of foresight. The world of tomorrow they pictured was the same as theirs, constant and known. (Also, it should be noted that some cathedrals collapsed as a result of short-sighted workmanship. A prayer was said during services: “Deare Lord, support our roof this night, that it may in no wise fall upon us and styfle us. Amen.”)
In the West, a deeper sense of time didn’t emerge until the 18th century. In the 1700s, the geologist James Hutton showed how the chronology written into Scottish rocks extended millions of years into the past. The philosopher Immanuel Kant wrote that there would be “millions and millions of centuries, in which new worlds and world orders will be generated,” adding: “Creation is never finished. It once had a beginning, but it will never end.” And writers began dreaming of futuristic worlds. In 1770, Louis-Sébastien Mercier published L’An 2440, a utopian novel about a man who wakes up in an idealized Paris of the 25th century. The book was banned by the Catholic Church; in Spain, the king supposedly burned it himself.
Over the next 200 years, this scientific and intellectual lengthening of the time span we could imagine paved the way for great strides in our understanding of ourselves and the planet. It allowed Darwin to propose his theory of evolution, geologists to radiometrically date the true age of Earth, and physicists to simulate the expansion of the universe.
Our awareness of deep time was here to stay, but that’s not the same as paying attention to it. The 18th-century European contemplation of a long, bright future was not to last. Periodically, perspectives would shorten, often through crises such as the French Revolution. Hölscher argues that you can see this transformation in writing from the late 1700s into the dawn of the 1800s: optimistic, far-reaching predictions about the world gave way to more circumspect descriptions of the future, focused on next steps and nearer-term improvements in standards of living. A similar contraction, he contends, took place with World War I, following the hopeful future-gazing of the early 20th century.
According to historian François Hartog, the author of Regimes of Historicity, we are in the midst of another shortening right now. He argues that at some point between the late 1980s and the turn of the century, a convergence of societal trends took us into a new regime of time that he calls “presentism.” He defines it as “the sense that only the present exists, a present characterized at once by the tyranny of the instant and by the treadmill of an unending now.” In the 21st century, he writes, “the future is not a radiant horizon guiding our advancing steps, but rather a line of shadow drawing closer.”
On the scale of civilization, it is difficult to test empirically the assertions of those who say we are living in a short-termist age. Future historians may have a clearer view. But we can still perceive the lack of longer-term thinking from which our society suffers.
You can see it in business, where quarterly reporting encourages CEOs to prioritize short-term investor satisfaction over long-term prosperity. You can see it in populist politics, where leaders are more focused on the next election and the desires of their base than the long-term health of the nation. And you can see it in our collective failure to tackle long-term risks: climate change, pandemics, nuclear war, or antibiotic resistance.
These risks make it increasingly important to extend our perspective beyond our own lifetimes; our actions are rippling further into the future than ever before. But as the Oxford philosopher Toby Ord has argued, this power to shape the future is not yet matched by foresight or wisdom.
There may be multiple forces fostering a short-termist mindset in our age. Some point to that often-blamed scourge, the internet. Others lament the intersection of 24-hour news media and politics, which encourages decision-makers to focus more on headlines or polling than future generations. Hartog blames the capitalist, consumerist norms that came to dominate Western culture by the late 20th century. During this period, “technological progress kept forging ahead, and the consumer society grew and grew,” he writes, “and with it the category of the present, which this society targeted and, to an extent, appropriated as its particular trademark.”
As with many ailments, there is probably no single cause: rather, the convergence of many is responsible. But we need not despair. If this account is correct, then short-termism is an emergent property of the cultural, economic, and technological moment. It need not last forever, nor is it totally out of our control. The assumption that things must always stay the way they are today is actually itself a form of presentism. But if we understand some of the psychological pressures that nudge us toward short-termism in daily life, we can find ways to combat them.
During a recent fellowship at MIT, I investigated how our psychological experience of the future can change. I was curious about what role the far future plays in our day-to-day lives, if any. I also wanted to know what psychological pressures might cause us to lose sight of the long term in everyday decisions. I call these pressures “temporal stresses.”
Some themes surfaced again and again, to which I’ve given the convenient acronym SHORT:
S – Salience
H – Habits
O – Overload
R – Responsibility
T – Targets
First, salience. Striking, emotionally resonant events tend to dominate our thinking more than abstract happenings. It’s a facet of the “availability heuristic,” a cognitive bias that means people are more likely to imagine the future through the lens of recent events.
Entrenched yet invisible habits play a role here. It’s harder to overcome the shortening effects of salience when we are doomscrolling on our phones through political controversy, crime, culture wars, disasters, or attacks. These events, while important, populate our imaginings of the future to a disproportionate degree.
Short-termist behavior can also plague organizations. For example, the Boston-based think tank FCLT Global recently reviewed the habits of corporations and warned against letting board meetings focus on compliance instead of long-term strategy, or failing to tell shareholders about long-term plans. Business leaders who establish different habits—such as Jeff Bezos, who communicates Amazon’s long-term principles to shareholders regularly—can create a culture among employees and investors that fosters the longer view.
Compounding all this is the overload of a connected life. I needn’t dwell on the acceleration of technological change and its effect on the information ecosystem, but if you are looking for evidence, consider that it took 71 years for telephones to be adopted by half the US population. By contrast, cell phones took only 14 years to reach the same milestone. And the internet? A mere decade.
As technology’s pace accelerates, the concomitant quickening of life, work, and information has further overloaded our attention. Research conducted in 2005 suggested that people’s picture of the future goes “dark” around 15 to 20 years hence. As the cosmologist Martin Rees has pointed out, it’s difficult to be a “cathedral thinker” when the lives of our children promise to be so radically different from our own—a problem that our medieval ancestors simply did not have.
The accelerated nature of 21st-century life has also diluted responsibility for our actions. The modern world has made it ever easier to detach ourselves from consequences and accountability. Consider the hamburger. A single consumer in a complex global supply chain shares only a tiny portion of responsibility for the ills involved in getting that burger to the table: carbon emissions, factory farming, water pollution, and more.
Slow, creeping problems like global warming don’t pop up on the attentional radar until something is burning or flooding.
When communities were small, goods were local, and societal obligations were more tangible, things were different. Centuries ago, people didn’t have to think about the damage caused by industrial farming, nor about atomic waste, ocean plastics, atmospheric carbon, or the other malignant heirlooms for which we are collectively responsible but not individually culpable. (And even in that far simpler world, civilizations occasionally collapsed after exhausting their natural resources, among other wrong turns.) We need ways to make those responsibilities more visible—and, crucially, hold people accountable.
The final temporal stress—and this is a major one—is targets. Today, metrics dominate all realms of life. Growth statistics. Efficiency scores. Shareholder returns. KPIs, GDP, ROI. If poorly framed, these targets foster presentism or even encourage bad behavior.
The sociologist Robert Jackall described one scenario in which this happens regularly. He called it “milking the plant”: a manager would arrive at a plant or factory with an ambitious set of targets from the board, and immediately crack the whip. Productivity would rise accordingly. Months later, the targets would be hit, and the manager would be promoted or move on. Left behind, however, would be a mess: unhappy workers and machinery run into the ground. The next manager would have to pick up the pieces with a new set of short-term targets—and the cycle would repeat.
The problem with metrics is captured by Goodhart’s Law, named after the British economist Charles Goodhart, which is often phrased as: “When a measure becomes a target, it ceases to be a good measure.” To escape short-termism, we must reassess the targets by which we gauge success. Do they encourage longer-term thinking, or do they prioritize only present-day gains?
We might start by thinking about how companies can do more to balance year-on-year or quarterly targets against long-term aspirations that last—or even exceed—a lifetime, like the commitments some oil companies have made to reach net zero emissions. We already manage this on a personal level to some extent, through our career, education, or family goals. Some attempts are also being made in the political realm to define metrics that extend decades or centuries, such as the UN’s Sustainable Development Goals, parts of which have been absorbed into laws and company policies around the world. (Wales, for example, passed the Well-being of Future Generations Act: loosely based on the UN goals, it requires public bodies to factor certain long-term aims into their decision-making.)
Fighting temporal stresses might be a struggle, but the targets we choose are entirely up to us. To paraphrase that well-worn aphorism: you overestimate what you can achieve in a day, but underestimate what you can achieve in a century.
The hinge of history
Identifying the temporal stresses that promote short-termism in our lives is only a starting point. Our greatest challenge this century is to transform our relationship with time. History suggests that our horizons have shortened before—but they can expand again. During the pandemic, our “presentism” has become even more extreme, but cultural norms have been challenged too. There may never be a better time to ask what future we actually want.
Some suggest we may be living at the “hinge of history,” a time uniquely influential for the future of humanity. We have never had so many ways to destroy ourselves through self-made dangers, from nuclear weapons to bioterror pathogens. But if we can plot a way through this period by embracing the long term, goes the argument, then our species—like other mammals—has the potential to survive for millions of years.
If humanity’s evolving time perception does mirror that of a child like my daughter, then our temporal maturity as a species could be yet to come. Perhaps we are merely in a tumultuous period of adolescence, and age will bring a sense of a deeper future. Like teenagers confronted suddenly with the consequences of their actions, we are facing a crisis brought on by our short-termism. Let’s hope it turns out to be merely the shock we need in order to grow up.