Messages move at light speed. Maps speak directions. Groceries arrive at the door. Floors mop themselves. Automation provides irresistible conveniences.
And yet automation can also be cast as a villain. When machines take over work that once required sweat and skill, humans atrophy into mere button-pushing operators. Laments about automation are as familiar as John Henry, the railroad steel-driver of lore who could not outlast a steam-powered version of himself. The latest is The Glass Cage by Nicholas Carr, who worries about the implications as machines and software advance far past the railroad and the assembly line to the cockpit, the courtroom, and even the battlefield. Machines and computers now do much more than rote mechanical work. They monitor complex systems, synthesize data, learn from experience, and make fine-grained, split-second judgments.
What will be left for us to do? While economists and policy makers are debating what automation will mean for employment and inequality (see “How Technology Is Destroying Jobs,” July/August 2013), Carr’s book does not sort out those implications. It is about what he fears will be diminished—our autonomy, our feelings of accomplishment, our engagement with the world—if we no longer have to carry out as many difficult tasks, whether at home or at work.
The Glass Cage: Automation and Us
By Nicholas Carr

Flow: The Psychology of Optimal Experience
By Mihaly Csikszentmihalyi
Harper Perennial, 1990

Life Work
By Donald Hall
Beacon Press, 1993
The centerpiece of his argument is the Yerkes-Dodson curve, which plots the relationship between human performance and the stimulation our tasks provide. Too much stimulation makes us feel panicked and overloaded, but when we have too little stimulation—when our work is too easy—we become lethargic and withdrawn. Activities that provide moderate stimulation yield the highest level of performance and, as Carr argues, turn us into better people in the process.
Carr, a former executive editor of Harvard Business Review and an occasional contributor to this magazine, has written several books that have challenged common beliefs about technology, like the added value of IT for businesses and the cognitive benefits of Google. In The Glass Cage he is channeling the anxieties of the contemporary workplace. Even talented white-collar workers feel as though they are half a generation from being rendered obsolete by an algorithm. But Carr is not analyzing the economic consequences of automation for the workforce at large. The book begins with a warning to airline pilots from the U.S. Federal Aviation Administration not to rely too much on autopilot. He narrates two crashes, tracing their cause to pilot inattention caused by the autopilot’s lulling effects. This reads like the opening of a utilitarian argument against automation: we ought to let pilots do their jobs because computers lack the judgment necessary to preserve human life during moments of crisis. Later, we learn that the safety records of Airbus planes and the more pilot-oriented planes built by Boeing are more or less identical. Carr’s core complaint is mainly about the texture of living in an automated world—how it affects us at a personal level.
At times, this seems to be coming from a position of nostalgia, a longing for a past that is perhaps more desirable in retrospect. Take GPS. To Carr, GPS systems are inferior to paper maps because they make navigation too easy—they weaken our own navigational skills. GPS is “not designed to deepen our involvement with our surroundings,” he writes. The problem is, neither are maps. Like GPS, they are tools intended to deliver their user to a desired destination with the least possible hassle. It is true that paper maps require a different set of skills, and anyone who finds this experience of stopping and unfolding and getting lost more enlivening or less emasculating than the new incarnation of way-finding can choose to turn GPS off, or use the two technologies in tandem.
In the zone
The classic account of life at the top of the Yerkes-Dodson curve is Mihaly Csikszentmihalyi’s Flow: The Psychology of Optimal Experience, published in 1990. Flow is a concept of almost poetic vagueness, hard to measure and even harder to define. Csikszentmihalyi found it in all kinds of people: athletes, artists, musicians, and craftsmen. What makes “flow” more than a flight of fancy is that almost anyone will recognize the feeling of “losing oneself” in a challenging task or being “in the zone.” As a concept, flow erases the boundary that economists draw between “work” and leisure or recreation, and Carr wants automation to be designed to produce it. Ideally it would have a Goldilocks just-right quality, relieving drudgery but stopping short of doing everything.
Like Carr, Csikszentmihalyi valorized physical work—which might be easier to do if you don’t rely on it for subsistence or a paycheck—and fretted that automation would deny laborers the chance to achieve flow. “The typical laborer now sits in front of a bank of dials, supervising a computer screen in a pleasant control room, while a band of savvy robots down the line do whatever ‘real’ work needs to be done,” he wrote. Most people, he observed, now did “jobs that would surely appear like pampered leisure to the farmers and factory workers of only a few generations ago.”
But surely the opportunities for flow or fulfillment depend more on your approach to your work than on the selection of the right technologies. The poet Donald Hall wrote in Life Work, his 1993 memoir, that even though his deskbound life was easier than those of his grandparents, New Hampshire farmers who split wood, baled hay, milked cattle, and canned vegetables, both his work and theirs gave rise to daily routines, which led to “absorbedness,” a flow-like state in which the hardness of hard work is dissolved by skill and habit. The conversation around what makes for “hard work” may tend to refer to physical labor because it is easier to narrate and observe. Hall suggests that the rhythms of work performed at a desk might not be so different, and that it is intention and discipline—not technology or other material conditions—that define the work we do. (Then again, he chose to write in longhand rather than use a typewriter or a computer. And he was writing about writing before the distractions of e-mail and Twitter.)
Carr spends most of The Glass Cage treating automation as though it were a problem of unenlightened personal choices—suggesting that we should often opt out of technologies like GPS in favor of manual alternatives. Yet the decision to adopt many other innovations is not always so voluntary. There is often something seductive and even coercive about them. Consider a technology that Carr himself discusses: Facebook, which seeks to automate the management of human relationships. Once the majority has accepted the site’s addictive design and slight utility, it gets harder for any one individual to opt out. (Though Facebook may not look like an example of automation, it is work in disguise. The workers—or “users”—are not paid a wage, and the product, personal data, is not sold in a visible or public market. But the echo of the machine room persists: personal expression and relationships are the raw material; the continuously updated feed is the production line.)
Carr flirts with real anger in The Glass Cage, but he doesn’t go far enough in exploring more constructive pushback to automation. The resistance he endorses is the docile, individualized resistance of the consumer—a photographer who shoots on film, an architect who brainstorms on paper. These are small, personal choices with few broader consequences. The frustrations that Carr diagnoses—the longing for an older world, or a different world, or technologies that embody more humanistic and less exploitative intentions—are widespread. For these alternatives to appear feasible, someone must do the hard work of imagining what they would look like.
Mattathias Schwartz is a freelance writer and regular contributor to the New Yorker. His last piece for MIT Technology Review was “Fire in the Library” (January/February 2012).