The Subtle Infiltration of Lethal Autonomous Weapons Systems


In March, Ukrainian drones filmed a spectacular sight: a platoon of unmanned ground vehicles (UGVs) conducting an assault unaccompanied by troops or crewed vehicles. Between the crude robots and the drones whizzing overhead, the scene was devoid of humans, a fleeting microcosm of a future battlefield. Admittedly, humans piloted both systems from afar, but the episode speaks to a larger point: the battlefield is becoming increasingly autonomous, while regulation stagnates.


A prominent Russian war propagandist praised the performance of the UGVs, highlighting their ability to operate in conditions where personnel and heavy equipment had been lost and drawing a parallel with the groundbreaking deployment of tanks in the First World War. Certainly, such bravado is at least partly self-aggrandising jingoism. The assault was hardly effective, with Ukrainian drone footage showing the immobilised UGVs subsequently destroyed by one-way attack (OWA) drones. Nor was the assault unprecedented: unarmed platforms have increasingly been deployed for logistical support and casualty evacuation, and this was not even the first combat use of UGVs, with both repurposed Soviet equipment and specialised systems previously employed as “kamikaze” drones.


But to over-scrutinise such incidents is to miss the forest for the trees. Unmanned weapons systems are a prerequisite for lethal autonomous weapons systems (LAWS), and they are proliferating at an alarming rate. In Ukraine, commercial quadcopters and fixed-wing drones already dominate battlefield reconnaissance, while loitering munitions and OWA drones conduct tactical strikes. Indeed, Ukrainian drone production is on pace to surpass one million units, partially compensating for munition deficits. Marine drones have scored several high-profile sinkings and helped neuter the Russian Black Sea Fleet. Beyond Ukraine, militaries are drafting doctrines focused on UGVs and testing drone swarms with varying levels of autonomy. A UN Security Council report even described an engagement in Libya that may have been the first instance of autonomous weapons killing humans.


The technical obstacles to the deployment of LAWS, in both hardware and software, are thus being resolved. The physical hardware has entered a real-world application phase, revealing new information about unmanned combat in peer conflicts. Evidently, the failures of these systems will continue to be documented and investigated, just as with the trailblazing Mk I tanks, which malfunctioned constantly and suffered mass casualties as British commanders initially struggled to incorporate them. Yet the deficiencies of early tanks were resolved, the technology was imitated and proliferated, and the system remains relevant despite repeated accusations of obsolescence. Likewise, recent developments in generative AI are widely known, but the full extent of the capabilities of the most advanced visual recognition and target selection software remains a closely guarded military secret. Already, AI is participating in target selection in real-world combat. The motivations are reasonable: troop casualties are politically costly, dynamic battlefields incentivise rapid decision-making, and direct human control may not always be possible amid widespread electronic warfare.


Resistance to proliferation has arguably been insufficient. US military officials, for example, have proclaimed that future systems will not be deployed without “a human responsible for the use of force,” even as land mines from decades past continue to take civilian lives. The EU’s groundbreaking AI Act explicitly excludes military applications. A UN General Assembly resolution in December 2023 acknowledging the danger of LAWS was supported by 152 countries, but contained no concrete regulation. In the continued absence of tangible multilateral agreements, “killer robots” are not just plausible but may be inevitable.


It is a future for which the wider public also seems ill-prepared. Perhaps cultural depictions like the Terminators or the droids in Star Wars belie a temporal, even fictional, separation from our current reality. The towering skyscrapers of Metropolis must have also seemed distant in 1927. Given the incentives, an evolution similar to that of the tank is plausible for unmanned and autonomous systems, and the consequences are difficult to fully imagine. Civilian suffering caused by mistaken autonomous target selection is disturbing, though admittedly already a persistent issue with human operators. LAWS accidentally, or intentionally, programmed to maximise human suffering are altogether more terrifying.


However, unmanned and arguably autonomous weapons systems are not just here but increasingly everywhere. Legal and military experts paying attention must better inform the public of the larger picture. Policymakers must regulate these systems while proliferation remains limited. As for the rest of us humans, we need to continue being in the loop.

From Canadian nation-building in the Boer War to combat drones in contemporary Ukraine, Dmytro Sochnyev has dedicated years to studying, writing and talking about topics in international relations & security at the University of Toronto, SciencesPo Paris and the Hertie School in Berlin. Nowadays, he is passionate about uncovering data-driven and outside-of-the-box remedies to issues at the cutting edge of our contemporary security. European defence, Arctic region stability, and emerging military technology governance are existential challenges that require our utmost scrutiny, diligence and creativity. The stakes have never been higher for his generation, but everyone can make a difference.
