Editor’s Note: This story relies upon anonymous sources who could not have spoken on the record without prosecution or other serious repercussions. The author revealed their identities to MIT Technology Review.
The unmanned aerial vehicle—the “drone,” the very emblem of American high-tech weaponry—started out as a toy, the fusion of a model airplane and a lawn-mower engine. While its original purpose was to bust up Soviet tanks in the first volleys of World War III, it has evolved into the favored technology for targeted assassinations in the global war on terror. Its use has sparked a great debate—at first within the most secret parts of the government, but in recent months among the general public—over the tactics, strategy, and morality not only of drone warfare but of modern warfare in general.
But before this debate can go much further—before Congress or other branches of government can lay down meaningful standards or ask pertinent questions—distinctions must be drawn, myths punctured, real issues teased out from misinformed or misleading distractions.
A little history is helpful. The drone as we know it today was the brainchild of John Stuart Foster Jr., a nuclear physicist, former head of the Lawrence Livermore National Laboratory (then called the Lawrence Radiation Laboratory), and—in 1971, when the idea occurred to him—the director of defense research and engineering, the top scientific post in the Pentagon. Foster was a longtime model-airplane enthusiast, and one day he realized that his hobby could make for a new kind of weapon. His idea: take an unmanned, remote-controlled airplane, strap a camera to its belly, and fly it over enemy targets to snap pictures or shoot film; if possible, load it with a bomb and destroy the targets, too.
Two years later, the Defense Advanced Research Projects Agency (DARPA) built two prototypes based on Foster’s concept, dubbed Praeire and Calere. Weighing 75 pounds and powered by a modified lawn-mower engine, each vehicle could stay aloft for two hours while hoisting a 28-pound payload.
Pentagon agencies design lots of prototypes; most of them never get off the drawing board. Foster’s idea became a real weapon because it converged with a new defense doctrine. In the early-to-mid 1970s, the Soviet Union was beefing up its conventional military forces along the border between East and West Germany. A decade earlier, U.S. policy had been to deter an invasion of Western Europe by threatening to retaliate with nuclear weapons. But now, the Soviets had amassed their own sizable nuclear arsenal. If we nuked them, they could nuke us back. So DARPA commissioned a study to identify new technologies that might give the president “a variety of response options” in the event of a Soviet invasion, including “alternatives to massive nuclear destruction.”
The study was led by Albert Wohlstetter, a former strategist at the RAND Corporation, who in the 1950s and ’60s wrote highly influential briefings and articles on the nuclear balance of power. He pored over various projects that DARPA had on its books and figured that Foster’s unmanned airplanes might fit the bill. In the previous few years, the U.S. military had developed a number of “precision-guided munitions”—products of the microprocessor revolution—that could land within a few meters of a target. Wohlstetter proposed putting the munitions on Foster’s pilotless planes and using them to hit targets deep behind enemy lines—Soviet tank echelons, air bases, ports. In the past, these sorts of targets could have been destroyed only by nuclear weapons, but a small bomb that hits within a few feet of its target can do as much damage as a very large bomb (even a low-yield nuclear bomb) that misses its target by a few thousand feet.
By the end of the 1970s, DARPA and the U.S. Army had begun testing a new weapon called Assault Breaker, which was directly inspired by Wohlstetter’s study. Soon, a slew of super-accurate weapons—guided by laser beams, radar emissions, millimeter waves, or, later (and more accurately), the signals of global positioning satellites—poured into the U.S. arsenal. The Army’s Assault Breaker was propelled by an artillery rocket; the first Air Force and Navy versions, called Joint Direct Attack Munitions (JDAMs), were carried under the wings, and launched from the cockpits, of manned fighter jets.
Something close to Foster’s vision finally materialized in the mid-1990s, during NATO’s air war over the Balkans, with an unmanned aerial vehicle (UAV) called the Predator. It could loiter for 24 hours at an altitude of 25,000 feet, carrying a 450-pound payload. In its first incarnation, it was packed only with video and communications gear. The digital images taken by the camera were beamed to a satellite and then transmitted to a ground station thousands of miles away, where operators controlled the drone’s flight path with a joystick while watching its real-time video stream on a monitor.
In February 2001, the Pentagon and CIA conducted the first test of a modified Predator, which carried not only a camera but also a laser-guided Hellfire missile. The Air Force mission statement for this armed UAV noted that it would be ideal for hitting “fleeting and perishable” targets. In an earlier era, this phrase would have meant destroying tanks on a battlefield. In the opening phase of America’s new war on terror, it meant hunting and killing jihadists, especially Osama bin Laden and his lieutenants in al-Qaeda.
And so a weapon designed at the height of the Cold War to impede a Soviet armor assault on the plains of Europe evolved into a device for killing bands of stateless terrorists—or even an individual terrorist—in the craggy mountains of South Asia. In this sense, drones have hovered over U.S. military policy for more than three decades, the weapons and the policy shifting in tandem over time.
A War without Boundaries
How this came about is another far-from-inevitable story. The rise of the drone met serious resistance from one powerful quarter: the senior officer corps of the United States Air Force, the same organization that developed the weapon. The dominant culture in each of the armed services—the traits that are valued, the kinds of officers who get promoted—is shaped by its big-ticket weapons systems. Thus, from 1947 to 1981, every Air Force chief of staff rose through the ranks as a nuclear bombardier in Strategic Air Command. For the next quarter-century, as spending on conventional forces soared, every chief of staff rose through the ranks as a fighter pilot in Tactical Air Command.
That’s where things stood in 2003, when President George W. Bush ordered the invasion of Iraq. As liberation became an occupation, which sparked an insurgency and then a sectarian civil war, U.S. commanders on the ground requested support from those shiny new Predator drones. The most lethal threat to American soldiers and Marines was the improvised explosive device, or roadside bomb. A drone’s camera in the sky could see an insurgent planting the IED and follow him back to his hideout. But drones (slow, unmanned hovering planes) were anathema to the dominant Air Force culture (which cherished fast, manned jet fighters). So the Air Force generals turned down or ignored the Army and Marine commanders’ pleas for more drones.
All this changed in 2006, when Bush named Robert Gates to replace Donald Rumsfeld as secretary of defense. Gates came into the Pentagon with one goal: to clean up the mess in Iraq. He was shocked that the generals in the three big services cared more about high-tech weapons for the wars of the future than the needs of the war they were fighting. He was particularly appalled by the Air Force generals’ hostility toward drones. Gates boosted production; the generals slowed down delivery. He accelerated delivery; they held up deployment. He fired the Air Force chief of staff, General T. Michael Moseley (ostensibly for some other act of malfeasance but really because of his resistance to UAVs), and appointed in his place General Norton Schwartz, who had risen as a gunship and cargo-transport pilot in special operations forces. Just before his promotion, Schwartz had been head of the U.S. Transportation Command—that is, he was in charge of rushing supplies to soldiers and Marines. As the new chief, Schwartz placed high priority on shipping drones to the troops in Iraq—and over the next few years, he turned the drone-joystick pilots into an elite cadre of the Air Force.
By the fall of 2009, toward the end of Barack Obama’s first year as president, the Air Force was training more drone-joystick pilots than airplane-cockpit pilots. It was the start of a new era, not only for Air Force culture but also for the American way of war.
That year, 2009, saw not just a surge in U.S. drone strikes—in part because more drones were available and the institutional resistance to them had evaporated—but also a shift in where those strikes took place. There was nothing politically provocative about drones in Iraq or Afghanistan. They were weapons of war, used mainly for close air support of U.S. ground troops in countries where those troops were fighting wars. The controversy—which persists today—began when drones started hunting and killing specific people in countries where the United States was not officially at war.
These strikes took place mainly in Pakistan and Yemen. Pakistan was serving as a sanctuary for Taliban fighters in neighboring Afghanistan; Yemen was emerging as the center of a new wing of al-Qaeda in the Arabian Peninsula. Bush had ordered a few strikes in those countries: in fact, the first drone strike outside a formal war zone took place in Yemen, on November 3, 2002, against an al-Qaeda leader who a few years earlier had helped plan the attack on the USS Cole. Bush also launched 48 drone strikes in the Waziristan region of Pakistan, along the mountainous border with Afghanistan—36 of them during his last year in office.
Obama, who had pledged during the 2008 presidential campaign to get out of Iraq and deeper into Afghanistan, accelerated this trend, launching 52 drone strikes on Pakistani territory just in his first year. In 2010 he more than doubled the number of these strikes, to 122. Then, the next year, the number fell off, to 73. In 2012 it declined further, to 48—which still equaled the total number of strikes in all eight years of Bush’s presidency. In a contrary shift, 2012 was also the year when the number of drone strikes soared in Yemen, from a mere handful to 54.
These strikes have provoked violent protest in those countries, alienating even those who had previously felt no affection for jihadists and had, in some cases, provided some support to the United States. At home, a political and legal debate rages over the wisdom and propriety of drone strikes as a tool in the war on terror.
Heightening the controversy is the fact that everything about these strikes outside war zones—including, until recently, their occurrence—is secret. Drone strikes in Iraq and Afghanistan, like all other military operations, have been conducted by the Defense Department. But drone strikes elsewhere are covert operations conducted by the Central Intelligence Agency, which operates in the dark (even congressional oversight is limited to the members of the select intelligence committees) and under a different, more permissive legal authority (Title 50 of the U.S. Code, not the Defense Department’s Title 10).