Zachary Kallenborn, published on the Modern War Institute, May 28, 2020
Drones and AI: this article is largely about projects that have not yet come to fruition. Still, it is interesting to see what they are thinking about. I saw demos of a drone swarm back in 2009 at a conference. They were pretty bumbling at the time, so we can see here that they have improved. Even so, they are not fine-tuned or, apparently, fine-tunable. The software has to solve problems on its own, so it cannot be 100 percent under control. Something to be concerned about.
In 2017, artificial-intelligence researcher Stuart Russell presented the “Slaughterbots” video at a meeting of the UN Convention on Conventional Weapons. When Dr. Russell and the Future of Life Institute released the video on YouTube, it quickly went viral. In the video, fictionalized swarms of drones recognize, target, and kill opponents autonomously. The drones assassinate activists and political leaders, and a slaughterbots manufacturer claims that $25 million of drones can wipe out half a city.
Although slaughterbots are fiction, numerous states are developing both drone-swarm technology and autonomous weapons. Every branch of the US military is developing drone swarms—including the Navy’s swarming boats and the Air Force’s plan to employ swarms in a wide range of military roles, from intelligence collection to suppression of enemy air defenses. Russia, China, South Korea, the United Kingdom, and others are developing swarms too. At the same time, a range of states have developed or are developing autonomous (primarily stationary defensive) weapons, from South Korea’s SGR-A1 gun turret to the United States’ Phalanx close-in weapon system. Combining these technologies creates a slaughterbots-style weapon: an armed, fully autonomous drone swarm—or AFADS. (For the purposes of this article, I define “fully autonomous” to mean weapon systems that are both self-targeting and self-mobile; “drone” as any unmanned platform operating on land, sea, air, or space; and “drone swarms” as the use of multiple drones collaborating to achieve shared objectives.)
Because of this, AFADS should be classified as weapons of mass destruction. As I argue in my new study at the US Air Force Center for Strategic Deterrence Studies, AFADS can exceed any arbitrary threshold for mass casualties and are inherently unable to distinguish between military and civilian targets.
Why Classification Matters (and Why It’s Hard)
Classification of drone swarms as WMD has significant conceptual, legal, and national security implications. Conceptually, understanding whether AFADS are (or are not) WMD requires careful debate over the scope of the term and its alternatives. While such drone swarms bear some strong similarities to traditional WMD, they also have major differences. Legally, classifying AFADS as WMD means the Seabed Treaty and the Outer Space Treaty apply to swarms. These treaties limit the placement of WMD in “global commons” areas (the ocean bed and outer space), but they do so without precisely defining WMD. Traditional WMD agents—biological, chemical, and nuclear weapons—also have numerous policies, programs, governmental and international organizations, and treaties aimed at combating their proliferation and providing a framework for responding to their use. From a national-security perspective, classifying AFADS as WMD also has an impact, since the use of WMD, including chemical agents, can radically change public support for military action. If AFADS are WMD, the non- and counter-proliferation policies, treaties, and norms applied to traditional WMD are all worth considering.
However, classifying a particular weapon as WMD is quite hard, as definitions have proliferated more than the weapons themselves. Seth Carus’s comprehensive review of WMD definitions identified twenty different definitions used by the US government alone. In part because of this, other authors disagree with the term “weapons of mass destruction” itself, highlighting its vagueness, the potential for political abuse, and the implication that all traditional WMD agents—biological, chemical, and nuclear agents—are of equal threat.
The terminological debate is too big and broad to resolve here, but regardless of which definition of WMD (or non-WMD alternative) is preferred, separating these weapons from conventional weapons implies WMD are inherently different in ways that warrant special attention.
Drone Swarms as WMD
Armed, fully autonomous drone swarms should be classified as WMD because of their degree of potential harm and inherent inability to differentiate between military and civilian targets—both of which are characteristics of existing weapons categorized as WMD.
The scalability of armed drone swarms means they can bypass any arbitrary threshold for defining “mass destruction”—regardless of whether such a definition is pegged to one thousand casualties, two thousand, or any other number. Whereas the size and impact of conventional weapons are limited by a number of factors, few limits exist on drone swarm scalability. Drone platforms are known, relatively easy to acquire technologies. The Center for the Study of the Drone at Bard College has identified ninety-five countries with military drones, comprising 171 different types of drone. The technology is rudimentary enough that basic drones can be bought at Best Buy or 3D printed. Converting drones into a swarm only requires the software and hardware to enable the drones to share information and make decisions and the finances to sustain development and acquisition.
Intel’s rapidly improving ability to control ever larger numbers of drones illustrates the ease of scaling. In 2016 the company flew one hundred drones simultaneously. In 2017 it flew three hundred. By 2018 it managed to fly 1,218 drones, and then 2,018. Give all 2,018 drones bombs and the collective certainly could inflict mass casualties.
Of course, the exact amount of harm is highly context dependent. Defenders may be armed with counter-drone systems or sophisticated air defenses. If slaughterbots become truly ubiquitous, states may just hang nets everywhere. Conversely, the flexible nature of drone swarms allows them to incorporate adaptations, such as standoff or chemical weapons. Drone swarms may also operate in multiple domains and incorporate antitank weapons, electronic-warfare equipment, or other systems that increase survivability.
Fortunately, so far few examples exist to judge drone swarms’ capacity for harm. The closest example occurred in January 2018, when Syrian rebels launched ten crude drones en masse against a Russian military base in Syria. Although the Russian military claimed it defeated the drones, the Free Alawite movement claimed to have destroyed an S-400 missile launcher valued at $400 million. Evidence on the damage is minimal and both actors have strong incentives to exaggerate or outright lie, so the exact harm is difficult to judge.
The nature of drone swarms incentivizes high levels of autonomy. As the number of drones in a swarm grows, so does the difficulty of controlling them. The activities of each drone must be coordinated to achieve objectives and prevent collisions. As the number of drones becomes truly massive, human control over the swarm may be impossible. Already, US Air Force drone operators face severe staffing shortages and higher rates of burnout than other career fields.
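The coordination burden described above can be made concrete with a back-of-the-envelope calculation. The short sketch below is an illustration of my own, not anything from the study: it counts the pairwise separation checks a naive, fully centralized controller would have to perform each control step (the function name is assumed; the swarm sizes are Intel's publicly reported figures). The quadratic growth is one reason large swarms push decision making onto the drones themselves.

```python
# Illustrative sketch (an assumption for exposition, not from the article):
# a naive centralized controller must keep every drone clear of every other,
# which means checking every pair of drones on every control step.

def collision_checks(n_drones: int) -> int:
    """Pairwise separation checks for n drones: n * (n - 1) / 2."""
    return n_drones * (n_drones - 1) // 2

# Swarm sizes from Intel's publicly reported drone light shows.
for n in (100, 300, 1218, 2018):
    print(f"{n:>5} drones -> {collision_checks(n):>9,} pairwise checks per step")
```

A 2,018-drone swarm already implies over two million pairwise checks per control step, which helps explain why large swarms lean on local, autonomous rules rather than direct human control of each drone.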
Autonomously determining whether a target is valid is an extremely difficult task. Consider an armed enemy soldier in uniform. Ostensibly, that’s an obvious legitimate target—unless the soldier is sick or injured. But if the soldier is pointing a weapon back, then even an injured soldier might be a valid target. Even more fundamentally, the autonomous system must be able to effectively distinguish between armed and unarmed, enemies and friendlies, and uniforms and civilian clothing. Even if the system can reliably distinguish a rake from a rifle, it would need to do so under difficult conditions where the object is obscured or disguised. Reliable discrimination may require near-human levels of artificial intelligence, which is unlikely to be possible in the near future, if ever.
The degree of difficulty will also depend on the domain of operation. Sea-based swarms (either surface drones or aerial drones used at sea) will face far fewer environmental obstructions than ground-based swarms. On the open ocean, tree branches will not obscure an adversary ship. Likewise, military vessels may be more readily distinguishable from civilian vessels due to the very different designs and the presence of large weapon systems. Nonetheless, the relative ease of discrimination is just that—relative—and it is still a major challenge to address.
Some states may elect never to develop such a weapon, because of these practical difficulties or for ethical reasons; however, it should be assumed that some states will. From the firebombing of Dresden to the Syrian government’s use of chemical weapons and various African genocides, numerous states have chosen not to worry about civilian casualties in pursuit of their military objectives. Iraq even sought civilian terror as a strategic goal: a terrified populace is less of a threat to the regime. Just as the AK-47 spread to unstable regions around the world, why not autonomous drones?
The United States should limit the proliferation of armed, fully autonomous drone swarms, establish norms against their use, seriously consider military force if they are used, and prepare the US military for the possibility of conflict. Specifically, the United States should consider taking several steps.
First, the US government could formally express its position that AFADS should be considered WMD. Broad recognition would help develop international norms against AFADS and encourage discussion over substantive responses. Particularly, the United States should take the position that AFADS fall under the scope of the Seabed Treaty and the Outer Space Treaty. Banning AFADS from use in outer space and the seabed may have secondary national-security benefits, such as reducing the risk from drone swarms to sea-based nuclear forces.
Second, the United States should expand the scope of counter-WMD organizations. The United States should evaluate whether governmental and international organizations concerned with countering WMD should incorporate AFADS. As AFADS are still an emerging threat, initial efforts should focus on preventing their emergence and on nonproliferation: codifying norms against usage in international treaties, expanding export-control regimes to incorporate AFADS, and developing policies to punish violators. The Department of State’s Bureau of International Security and Nonproliferation and the Department of Commerce’s Bureau of Industry and Security are likely to be key players.
Third, Washington should explore verification and confidence-building methods. Verifying the use of AFADS is likely to be highly difficult, because one key aspect, full autonomy, exists primarily in code. A variety of proposals have been developed to address this problem for autonomous weapons in general, and they may be applicable to swarms. For example, states could develop control systems requiring operator authentication, air-gapped firing authorizations, and information sharing on methods for ensuring safe operation. Efforts to identify and develop effective methods should be undertaken in collaboration with other governments.
Finally, the United States and the broader international community should debate whether the use of AFADS is sufficient to merit military intervention in a conflict. This debate should focus on the scale of usage and nature of the target (military vs. civilian). The use of a ten thousand–drone swarm on a civilian population might merit intervention, but the use of a two-drone swarm against a military base during an ongoing conflict probably should not. Establishing an exact threshold is likely to be impossible; however, states may identify scenarios and broad factors that would support or reject intervention. States should also consider options below the level of military force (e.g., sanctions), and collaborate with the international humanitarian law community to identify existing legal frameworks aimed at restricting weapons that cause excessive civilian harm.
Drone swarm technology, particularly self-targeting, self-mobile drone swarms, poses a significant risk to global security. Failing to develop international norms, supported by robust policies, to prevent and counter AFADS emergence risks a less secure United States and a far more dangerous world.
Zachary Kallenborn is a Senior Consultant at ABS Group, specializing in unmanned systems (swarms), WMD terrorism, and WMD warfare writ large. His research has been published in the Nonproliferation Review, Studies in Conflict and Terrorism, War on the Rocks, DefenseOne, and other outlets. His most recent study, “Are Drone Swarms Weapons of Mass Destruction?” examines whether drone swarms should be considered WMD and their ability to serve in traditional WMD roles.
The views expressed are those of the author and do not reflect the official position of the United States Military Academy, Department of the Army, Department of Defense, or any of the author’s current or former employers or funders.
Image credit: Pvt. James Newsome, US Army