April 20, 2024

Would a cyberattack that takes down the Texas power grid in the peak of summer be an act of war?

“There is no spoon.”

The 1999 film The Matrix held a critical lesson for its protagonist, Neo. One cannot change physical reality; one can change only oneself. The message came from a young boy who was apparently bending spoons telepathically, but as he handed an unbent spoon to Neo, he said, “Do not try to bend the spoon. It’s impossible. … There is no spoon.” For 21st century strategists, a spoon isn’t the object in question; it’s our understanding of the nature of war. If we reimagine the Matrix as a ‘gray zone’ where competitors threaten the United States below the level of armed conflict, then strategists must bend themselves to encompass a new and broader understanding of war’s nature, even though that nature has not changed and does not change.

If war is fundamentally a violent clash of political wills, we must broaden our understanding of ‘violence’ to properly define 21st century war. Two problems confront us in this task, and both reflect a tendency to privilege one part of the definition of violence over the other. One dictionary defines violence as physical force, arising either from human activity or from interaction with tangible things, intended to injure, harm, damage, or kill. However, the classic perspectives on war that I learned at the War College – including Thucydides, Clausewitz, Sun Tzu, and many others – all emphasized war as an activity between humans. Weapons and other tangible objects were means to achieve human-defined ends. Cyberattacks and lethal autonomous weapons systems did not exist then, but they do today, and they raise important questions about the character of war now and into the future.

The second problem is an overemphasis on force intended to kill, with only minimal attention to the intent to hurt or damage. The power to hurt, à la Thomas Schelling, is the hallmark of coercion, but war is understood as the realm of brute force. This unwillingness to treat as war the ‘gray zone’ between the power to hurt and the power to kill is one our adversaries will continue to exploit until we adjust our thinking and, therefore, our approach.

For example, all three of the United States’ major military competitors have artificial intelligence (AI) development strategies and plans. China’s 2017 New Generation Artificial Intelligence Development Plan and its 2015 Made in China 2025 initiative both outline China’s AI development strategy. Chinese defense executives openly acknowledge that autonomous weapon systems will rule the future battlefield — perhaps as early as 2025. China’s defense manufacturers already produce autonomous armed drones. Russian President Vladimir Putin similarly views AI as “humanity’s future,” and Russia developed a 2019-21 roadmap for AI that projected $719 million for AI research and development (R&D). Iran ranks 16th in the world in AI R&D, and Iranian universities began teaching AI 16 years ago. Iran’s developments include three variants of military robots for mine-sweeping, rescue, and other missions, along with add-on weapons capabilities. The question is: will these lethal, fully autonomous weapon systems (armed robots), detached from direct human control, rule the future battlefield?

The possibility that the answer is ‘yes’ has spurred action from senior U.S. military leaders. The U.S. Department of Defense (DoD) recently unveiled its Joint Enterprise Defense Infrastructure (JEDI) cloud program to underpin its efforts to develop, and potentially weaponize, AI. In June 2018, DoD established the Joint Artificial Intelligence Center (JAIC) to manage nearly 600 AI projects, worth $1.7 billion, currently in development. The Defense Advanced Research Projects Agency (DARPA) plans to invest an additional $2 billion in AI research over the next five years. However, the absence of established international norms or regulations governing the use of AI systems in war greases a ‘slippery slope’ toward unintentional escalation into war.

What happens when robots acting in a ‘gray zone’ … get violent?

Fully autonomous weapons and cyber capabilities increase the likelihood that competition may escalate to war without direct human involvement, and with or without the loss of human life. The U.S. military views conflict as a continuum that runs from cooperation to competition to, ultimately, war. What happens when robots acting in a ‘gray zone,’ that is, between competition and war, get violent? After the 2010 Stuxnet attack on Iran’s Natanz uranium enrichment plant, this question seems less farfetched. Fully autonomous weapons and cyberattacks are inevitable features of future battlefields. How thin runs the line between competition and war when there are no battlefield deaths to tally?

If one accepts that fully autonomous weapons and cyberattacks may be used in armed conflict, how might that conflict escalate to war? Here we must be clear about what constitutes war, but this definitional work is contested, and the differences matter. For example, some scholars have suggested using battle-related deaths as the parameter for determining whether a conflict counts as a war, holding that 25 deaths per year constitutes armed conflict and more than 1,000 constitutes war. But what are the consequences of this definition in a world where deaths are less frequent, whether intentionally or because of improvements in medical care and treatment? Would a cyberattack that takes down the Texas power grid in the peak of summer be an act of war? What if a competitor’s fully autonomous unmanned aerial vehicle (UAV) misidentified a fleet of commercial ships as an enemy military fleet and launched a swarm attack on the inbound merchant vessels? Would the ensuing, significant disruption to the U.S. economy be war? The United States must grapple with such questions soon, before events pose them first.
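To see how brittle a casualty-count definition becomes in a bloodless gray zone, consider a minimal sketch that applies the 25/1,000 thresholds cited above. The function name and labels are illustrative assumptions, not drawn from any scholarly dataset:

```python
# Minimal sketch of a casualty-threshold conflict classifier, using the
# 25/1,000 battle-death figures cited above. Illustrative only.
def classify_conflict(battle_deaths_per_year: int) -> str:
    """Label a conflict solely by annual battle-related deaths."""
    if battle_deaths_per_year > 1000:
        return "war"
    if battle_deaths_per_year >= 25:
        return "armed conflict"
    return "below threshold"

# A grid-down cyberattack or a drone swarm against merchant shipping can
# devastate an economy while producing zero battle deaths, so it never
# registers as war under this definition.
print(classify_conflict(0))     # -> below threshold
print(classify_conflict(1200))  # -> war
```

Under such a rule, the Texas grid scenario above would never cross the definitional line into war, however severe its consequences.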

British Army officer Charles E. Callwell stated that “theory cannot be accepted as conclusive when practice points the other way.” One must accept the enduring nature of war not because practice points the other way, but because practice illuminates its full range. Violence in war consists of both a human and a non-human element. Wars have historically led to human deaths, but future wars may wreak only bloodless injuries or catastrophic property damage. Nations already battle in the cyber realm, and the future battlefield will include fully autonomous weapon systems. The country that emerges as the world leader in militarized AI technology and cyber capabilities will seize an edge on that battlefield. Now that the gray zone is a modern reality, will leaders continue trying to ‘bend the spoon,’ or will they bend their understanding toward war’s true nature?


Terrie Peterkin, PhD, is a colonel in the U.S. Army and a graduate of the U.S. Army War College resident class of 2019. The views expressed in this article are those of the author and do not necessarily reflect those of the U.S. Army War College, the U.S. Army, or the Department of Defense.

Photo Credit: Anton Holoborodko (Антон Голобородько) via Wikimedia Commons, under the Creative Commons “Share Alike 3.0 Unported” license

Photo: Military base at Perevalne during the 2014 Crimean crisis

2 thoughts on “THERE IS NO SPOON: RETHINKING WAR IN THE GRAY ZONE”

  1. A public policy discussion over the definition of “war” or the establishment of a “conflict threshold” will be a huge challenge in light of the political changes we see from administration to administration. I thought this was illustrated perfectly when I saw a decades-old and important policy (since we bowed out of the chemical warfare game in the ’70s, the establishment of a policy equating the use of chemical weapons with the use of any WMD, and broadcasting that we would treat it that way) erased by the Obama administration with no serious substitute, ostensibly to give the President the freedom NOT to use nukes even if we were subject to a biological or chemical attack.

    In light of rapidly shifting political winds, how do we select a course/definition and stick to it, with enough lead time to develop the appropriate strategies and capabilities before the political winds change again?

    Unprecedented agility seems to be the only sensible answer.

    Buckle up!…mrb

  2. Reading your article brings to mind two writers: Ayn Rand and Robert Heinlein. First, to Rand on what is “The Good.” The Good is anything that continues and improves life; anything that ends or harms life would be considered Bad or Evil. Attacking the power grid, especially during summer or winter, would result in deaths and would be considered an attack on the civilian population. Something more subtle, such as poisoning water supplies, destroying crops, or other attacks on infrastructure, falls more in your gray zone, but the intent is to kill or severely harm, and it is more difficult to prove if nobody claims responsibility. What if someone with Ebola spent hours on subway trains? What if it were several people in different areas? If I dug a very deep and wide moat around your house, it’s not an attack, but it takes away your ability to leave, get food and water, or get electricity; then all I have to do is wait. That would be Bad/Evil.

    On to Heinlein. The most important question asked in one of his novels was how many of your people have to die before you start a war that will kill even more people. What level of damage or pain will we take before the military is called into action? It took the Pearl Harbor attack to get us into World War II; it took nearly 3,000 dead from the 9/11 attacks to get us into Afghanistan. How much of our food supply would have to be destroyed before we attacked the culprits? The enemy is figuring out that attacking Americans in a stand-up fight or in large numbers is difficult or results in backlash, but small steps, small attacks, can be carried out successfully and repeatedly to get the desired effect. So to the question we’ve been asked in the past, “What about when robots fight our wars?” Well, after my robots destroy your robots, I send my robots to kill you.
