Shortly after I turned 18, the United States Marine Corps trained me to live, think, and operate as one of the most lethal humans walking the earth. They transformed me from a typical suburban American kid into their ideal fighting machine through a perfected, scientific regimen of psychological rewiring, physiological restructuring, and moral recoding. After 10 months in the grunt lab, I was assigned to an infantry battalion. I operated with a new kinesiology of the body and soul that had not only prepared me for war but created a thirst for any brand of conflict. I had an understanding of what perfection on the battlefield would look, sound, and taste like. I had become a Battle Bot.
My lethality increased with each personnel addition: from me, the rifleman, to the four-man fire team, the squad, the platoon, the company, the battalion. Each time, add new men, add new hunger, more firepower, more expertise, more technology with which to lay waste to the enemy. As the fighting organism grows in size, so does the inability to pause mission and consider whether the killing is just or moral: the killing just is.
Every generation of American warfighters is handed excellent new gadgets with which to wage war. And who doesn’t love a new toy? Their creators become fabulously rich developing, training the military on, and helping deploy the newest technology. The tech often acquires catchy nomenclature and is extremely effective at killing large numbers of people: think of the MOAB, Mother of All Bombs. Fat Man. Hellfire. Sidewinder.
During Operation Desert Shield, the Barrett .50 caliber semi-automatic sniper rifle arrived in the Saudi Arabian desert where I and my battalion and tens of thousands of other American forces waited for war with Iraq. At the time the military possessed only a few dozen of the weapons, and my sniper team partner and I were two of a select group trained to deploy the Barrett in combat for the first time.
A celebratory, even giddy atmosphere took over the remote desert range that had been purpose-built for us and this new weapon. Division-level officers came out to watch us train. We were served three hot meals a day. At night we burned massive bonfires and discussed our imminent march to war.
My partner, Johnny, was a sergeant, a division-level, school-trained sniper, and—truthfully—a better shot than I was. With the Barrett we both hit iron targets out at 1,600 to 1,800 yards. The shooting was easy. The shooting was fun. We had been gifted this weapon that extended our dark arts by nearly a thousand yards. The Geneva Convention banned us from using a .50 caliber weapon on a human target, so the official reason the weapons had been released to us was to stop enemy vehicles. But we all knew the best way to stop a vehicle is to kill the driver. The technology told us so. And we listened to the technology.
Had I been given the chance, I would have used the Barrett on a human target. Specifically, on his head. At night, sleeping under our Humvee, I dreamed of observing an Iraqi convoy through my scope. Johnny and me in a sniper hide a thousand yards away. Johnny my spotter, me on the Barrett. I fire and put a round into the head of the driver of the first truck and then methodically kill more men from this God-like distance with this menacing new rifle. The Barrett’s technological enhancement of my sniping skills made me more lethal, and more morally compromised, if only in theory and in dream.
Many now profess that the young Marine or soldier with a rifle is obsolete. The greatest weapons race of all is among academic scientists trying to win DARPA funding for new warfighting technology they insist will require scant human interface with the killing act, thus relieving the combatant of the moral quandary and wounds of war. Private-sector startups sell a myth of smart war through AI, or robotic soldiers. In labs where the newest and cleanest ways to kill are being invented, the conversation is not about the morality of going to war, but rather the technology of winning. But when you rely on a myth of technology and distance killing to build a rationale for easy war, your country will lose its soul.
“The enemy meets us where we are weak,” an Air Force pilot friend told me once. In Vietnam, America’s advanced bombers were ambushed by North Vietnam’s lower-quality aircraft, Russian-built MiGs. We should not have lost as many aircraft and pilots as we did.
In Afghanistan, over 18 years of war, we have learned that the Taliban, al Qaeda, and ISIS do not regularly present en masse on the battlefield. The most technologically advanced military in the history of the world cannot claim victory against an enemy that chiefly employs small arms and shoulder-launched grenades and missiles and basic guerrilla tactics. They are nearly impossible to pinpoint, despite our billions of dollars of surveillance satellites and drones. Mostly we find the bad guys with a little bit of luck or a cash payment to a village elder. Paper technology.
Sophisticated weapons systems have one drawback: the enemy must expose himself within the effective killing range in order for the weapon to work as intended. The smart combatant, of course, rarely exposes himself. So when we do home in on enemy fighters, we use a $30 million aircraft to drop a JDAM (joint direct attack munition) and kill a dozen guys living in tents on the side of a mountain. What has that $30 million technological advantage bought us? The highly (and expensively) trained aviator piloting a beautifully complex flying and killing machine just extinguished some men living under canvas and sticks, men with a few thousand rounds of small arms ammo at their disposal. The pilot will return to his expensive air base or carrier. He will have a hot shower, eat hot chow, Skype his wife and children, maybe play some Xbox, and hit the gym before he hits the rack. He will not, nor will he be asked to, concern himself with the men he killed a few hours ago. And in a draw or valley a few klicks away from where the pilot’s munitions impacted, there is another group of men living under extremely basic circumstances, eating boiled rice and maybe a little roasted meat. They will ambush an American convoy or attack a government-friendly village in the morning. Native grit debases our technologically superior forces and materiel. Native grit wins a war.
Imagine if 9/11 had included a ground invasion by a technologically superior enemy. Imagine if they still occupied your city: you and your children would have fought a battle today, with bricks, rifles, and roadside bombs. The attacks on 9/11 activated an American impulse that had been dormant for decades: the will to defend home territory. But since World War II that will, whether real or manufactured by political and journalistic spin, has not translated to a military victory on foreign soil.
The reality is that it’s difficult to locate the morality of and passion for defending an American military outpost built overseas out of Hesco barriers and Geocells. The enemy will hit us where we are weak, and we are weak inside a military compound infiltrated by a single Taliban fighter wearing an Afghan army uniform. His father or brother died on that mountain the other day, or a year ago, or 15 years ago. Inside the base wire we think we are strong and safe, but actually we are weak because we lack a moral necessity for being there. The Taliban fighter fires an AK-47 with a 30-round magazine and kills a few unarmed Americans—a soldier, a CIA operative, a military contractor—plus a friendly Afghan soldier.
We are incapable of stopping this attack because it was not hatched in a university weapons lab funded by DARPA; it was born on the side of a mountain or in a village 100 or more years ago. The impulse for the retaliatory strike is in the young man’s DNA and in the dirt and rain and crops of his home place. Lethal soldiers with lethal weapons have trampled his country and kin for decades—centuries, even. We will never out-tech the deepest passion to persevere and claim victory and sovereignty over one’s own land for one’s own people.
At the street level, war is a people business. And people are complex. They are also fragile. Their bodies break, crumble, split open, and cease operating with surprising ease when met with the awesome newest war technology. The reality of a war-dead civilian or combatant is not changed by how advanced the tool was that delivered the fatal assault to the body.
The lust for new defense technology is an insidious attempt to distance ourselves and our leaders from the moral considerations and societal costs of waging war. It’s not so much about the newest tools—swarm drones, exoskeletons, self-guided sniper projectiles. It is that this reliance on technological cool, the assumption that it lessens or alters the lethality of war, allows zero accountability for how, when, and why we fight.
This is not an anti-intellectual or anti-technology argument. I am not a grunt who thinks wars can only be won with boots on the ground. However, all wars must eventually be won with boots on the ground. The problem is not the technology, but the equivocation that high-tech military armament invariably invites. If fighting war is like swiping your smartphone for an order of groceries or posting a meme to Instagram, how bad can it really be? And if a politician is seduced by the lies and supposed ease of technological warfare and leads us into a mistaken conflict, is it really his or her fault? Didn’t we all think it would be a breeze?
The moral distance a society creates from the killing done in its name will increase the killing done in its name. We allow technology to increase moral distance; thus, technology increases the killing. More civilians than combatants die in modern warfare, so technology increases worldwide civilian murder at the hands of armies large and small.
The person with the least amount of distance from the killing—typically an infantryman or special operator—is the most morally stressed and compromised individual in the war’s chain of command. When close-quarters combatants understand that the killing they have practiced is not backed by a solid moral framework, they question every decision taken on the battlefield. But they also question the meaning of the fight. They count their dead friends on one or even two hands. They count the men they have killed on one or two hands, or by the dozen. The moral math will not compute.
The photos and videos of war on our television screens, on our computers, on our smartphones, tell us nothing about the moral computations of the warfighter. The warfighter understands that when a friend is killed on patrol, that is just part of the package. Another part of the package is going back out on another patrol tomorrow. But as you live and operate for longer in a hostile environment, your hatred of the enemy increases and your trust in leadership decreases. You create a moral wound against yourself.
War was supposed to be easy or fast, because of smart bombs and the latest bit of warfighting technology. But this means nothing when years later you only see dead men, women, and children when you try to sleep.
When we believe the lie that war can be totally wired and digitized, that it can be a Wi-Fi effort waged from unmanned or barely manned fighting apparatus, or that an exoskeleton will help an infantryman fight longer, better, faster, and keep him safe, no one will be held responsible for saying yes to war. The lie that technology will save friendly, civilian, and even enemy lives serves only the politicians and corporate chieftains who profit from war. The lie that technology can prevent war, or even create compassionate combat, is a perverse and profane abuse of scientific thinking.