On September 11, when terrorists forcibly diverted two airline flights into the twin towers of the World Trade Center and a third plowed into the Pentagon, stunned surprise and inconsolable grief could be our only initial response. Then came an apprehension that will long be with us: how many other terrorist cells are still out there, and will we be able to find them in time?
But to many of those who have followed the scientific and technical side of warfare and terrorism, there was yet another jolt. It was comparable to the horror of the military analysts in December 1941 who had been expecting a Japanese preemptive strike in the Philippines or elsewhere in Asia, but not at Pearl Harbor. Assumptions were fatally wrong. Things were not supposed to be this way. We faced an old nightmare, not the futuristic dystopia of information warfare and massive chemical or biological attack that we had dreaded.
In the 1990s, as advanced systems triumphed in the Gulf War and the Nasdaq index began to soar, conflict was supposed to be going high tech. In December 1995, for example, a dozen Marine Corps generals and colonels, including the commandant, General Charles Krulak, visited the World Trade Center. They were studying how to master information overload by observing some of the top traders of the New York Mercantile Exchange practicing simulated commodity activity. Later, they conducted simulated combat exercises with 15 traders at advanced workstations on Governor’s Island off the southern tip of Manhattan. How could the images on those 69-centimeter monitors have warned them that less than six years later, the Twin Towers would become the battleground of a domestically launched air war?
Of course we feared attacks from the Middle East and elsewhere in the late 1990s. But the bad guys, we thought, were getting online, just like us. As the year 2000 approached, military and civilian authorities were on high alert, not only for accidental failures of vital systems but for cyberattacks using the date change as a smoke screen. Yet the nation’s pipelines and electrical grids survived the new year without incident. Even the powerful anti-U.S. emotions of the Kosovo war produced no serious assault on the U.S. infrastructure. Only too late did we realize what a cataclysm had been in preparation.
Our tragic mistake was not that we pursued the new. It was that we neglected the old. And it’s a pattern that could have troubling implications if we don’t recognize its applicability to other key parts of our technological culture.
In the case of the September 11 attacks, as journalists soon realized, the terrorists' methods were surprisingly low tech. In fact, the technologies involved had been established for a generation: 30 years, plus or minus five.
While the building design was tested to withstand a hit from a 707 jet, the Boeing 747, with its immense fuel loads, was already in service by 1969. The terrorists also apparently needed no sophisticated knowledge of automatic pilots and global positioning satellites. They had simply, and all too well, learned the classic principles of flying.
The immediate goal of the hijackers was also a 1970s concept: stunning the world with photogenic violence, as at the 1972 Munich Olympics. Thanks to satellite feeds, cheap color televisions and the Internet, these images could now have a more rapid and vivid impact, but the principle was old hat. So was the idea, dating at least from the time of the Ho Chi Minh sandal (carved by resourceful Viet Cong soldiers from segments of rubber tires), that the improvised technologies of poor countries and peoples might humiliate the West. The terrorists understood all too well this neglected feature of technology: with enough determination, practice and time, mature and even seemingly outdated tactics and devices can be reborn.
What can halt future attacks? The events showed the limits of communications monitoring and satellite surveillance. The question remains whether more ambitious programs like the FBI’s troubled Carnivore e-mail-sniffing technology or facial recognition software will unearth new data on terrorist activity, or simply compound the familiar problem of information overload and produce an illusion of control. The frequent false alarms from even the simplest home security systems are already a plague for the police.
We obviously need to think more about protection from both newer and older forms of attack. One common feature of both is reliance on personal networks. The terrorist cells' apparent methods of recruiting from the same regions, clans and families, and moving frequently from base to base, make them difficult to infiltrate conventionally, but they also reveal patterns to experienced analysts, making more targeted technical surveillance possible. We don't need another decimal place of accuracy from computational social-science studies but a better intuitive understanding of the terrorists and their civilian neighbors. At the same time, the tacit knowledge possessed by the most effective police officers and detectives deserves more respect. One of our great challenges will be to formalize and teach these elusive skills to security screeners at airports and elsewhere.
But the shock of the old is not limited to breaches of national security. The civil engineer and historian Henry Petroski, in his book Engineers of Dreams: Great Bridge Builders and the Spanning of America, points to a 30-year cycle in which a new generation of professionals forgets the hard-won lessons of its predecessors’ errors. Indeed, there are signs that many professions have started to lose their technological balance. Many U.S. medical residents, for example, are no longer highly skilled in using a stethoscope to interpret body sounds. The demands of training physicians for tomorrow’s biotechnology may be in conflict with the best preparation for hands-on contact with today’s patients. Doctors obviously need to know the latest science, but both educational trends and the pressures of managed care make it harder for them to read facial expressions alongside lab reports.
What makes a good lawyer, too, is not just access to databases of legislation and decisions but intuitive knowledge of clients and clients’ environments. That’s why most lawyers still avoid representing themselves despite all the new tools at their disposal. They’re paying not for formal information but for tacit knowledge.
Librarians tell me that students often spend much more time finding certain information on the Web than they would have needed using standard printed reference books. Internet skills are indispensable (in fact, they too are not taught enough), but so is the ability to access the vast body of essential knowledge that has not been and may never be available in an electronic format. The high cost of both electronic and paper information, not to mention terminals and printers, challenges librarians, but most of them recognize that each mode has irreplaceable advantages.
In fact, engineering itself is not just the application of mathematical equations but a subtle balance of aesthetics, economics and science in which culture counts as much as calculation. Computer-assisted design can accelerate execution of ideas but can never replace the insight that comes from immersion in the traditions of building. It was the cultural resonance of towers and polygons, used by brilliant designers, that made the targets of September 11 such powerful icons, not simply their acres of usable space.
Just as the Nasdaq’s collapse well before September 11 was a symptom of an economy out of balance, so this infinitely greater catastrophe reminds us to seek a new equilibrium between virtual reality and the real kind, between pixels and iron and concrete, flesh and blood. As Dan Rather told his viewers during the ordeal, “This is not a graphic.”