For decades, the U.S. military has been working to move information ever more swiftly from “sensor to shooter.” The more quickly targeting data is relayed, it is thought, the more likely our weapons are to hit the enemy. Making this process smoother and faster has been a fundamental aim of the advocates of “network-centric warfare,” a concept many consider crucial to the “revolution in military affairs.”
This movement, championed most ardently by the late Vice Admiral Arthur Cebrowski, who was director of the Office of Force Transformation in the U.S. Department of Defense from 2001 to 2005, has attracted support well beyond his acolytes. That’s because warfare has shifted away from massed, set-piece battles between similar forces. Instead, nearly all conflicts since the end of the Cold War can be described in terms of swarming “hiders and finders.” Combatants stay hidden, pop up to strike, and then disappear until they attack again.
This change has been most apparent in the rise of anticolonial guerrilla wars over the past half-century. In these conflicts, insurgents remain hidden, swimming in "the sea of the people," as Mao put it. Conventional militaries have lost most of their wars against such enemies.
Today, this hider-and-finder dynamic has become even more dominant, and the conflicts in Iraq and Afghanistan, as well as the global hunt for al-Qaeda operatives, place an extraordinary premium on knowing where the enemy is and what he is doing. This realization has spurred increased emphasis on rapid collection and dissemination of timely, targeted information about the enemy (see “A Technology Surges”).
The years since September 11, 2001, have seen remarkable technical advances in information systems, from sensors and communications links to weapons-guidance packages. For example, I participated in creating a "surveillance and target acquisition network" that allowed real-time sharing of voice, video, and text between ground forces, pilots, and unmanned aerial vehicles like the Predator. Today, new systems are being fielded to allow soldiers to enter data on the spot, even during battle.
These technologies are wonders, but generally they have not been accompanied by shifts in military doctrine and organization. The result: a tidal wave of data is being created that can swamp systems still organized around large units (such as army divisions, naval strike groups, or air force wings) whose goal is to apply “overwhelming force” at some mythical “decisive point.” Generally speaking, these large units cannot quickly disseminate the information they collect throughout their networks and then allow smaller constituent parts to swarm against insurgents.
This disjunction between technology and organization was one reason we floundered in Iraq from 2003 to 2006. But a shift began last year, away from big units on supersized forward operating bases to a network of small outposts. The latter’s tiny but well-informed garrisons put a dent in the insurgency with a multitude of small-scale swarming raids on terrorist cells. At last, tactics and organizations had emerged to exploit the possibilities implied by advanced technological functions.
But this process has only just begun. We risk overemphasizing technology and leaving the hard-learned lessons of Iraq behind, just as our knowledge of unconventional warfare withered after Vietnam because we preferred to prepare for large, set-piece battles. Then we fell in thrall to precision-guided weapons systems. Now a similar enchantment accompanies a range of information technologies. It is a spell that can be broken only by remembering that new organizational forms and practices must develop along with new tools.
John Arquilla is professor of defense analysis at the Naval Postgraduate School. His next book, Worst Enemy: The Reluctant Transformation of the American Military, will be published in April.