Killer robots: should we be afraid?

Automation, the third revolution in warfare

with Richard Robert, Journalist and Author
On November 9th, 2021 | 3 min reading time
Key takeaways
  • Some experts consider autonomous weapons to be the third revolution in warfare technology, after gunpowder and the nuclear bomb.
  • The automation of weapon systems began decades ago. Advances in mobility and the interpretation of environmental information now give them a high degree of autonomy.
  • Development of drones is still ahead of “infantry” robots, which face considerable technical challenges.
  • In their armed version, these systems are mainly used against physical targets. Their use for lethal purposes is driving ethical debates, but the arms race has already begun.

From guided missiles to autonomous lethal weapons

Guided missiles such as the Exocet, in use since the 1980s, share some of the characteristics of autonomous robots. When they are in automatic mode, they use information about their context to adjust their trajectory. However, the term ‘robot’ is reserved for devices with greater autonomy: machines capable of processing a wider variety of information, with endurance extended from a few minutes to several hours, full mobility, and the ability to make a wider range of decisions (e.g. whether or not to fire).

Greater autonomy of ‘lethal autonomous weapons’ (LAWs) has been made possible by advances in on-board computing – namely miniaturisation of processors, increased precision of sensors – and mobility. There are two types of systems: drones and ground robots.

#1 Drones

Most marine and aerial drones are equipped with an automatic mode. Initially used for observation and reconnaissance missions, and later for laser guidance, they were armed in the early 2000s in Afghanistan and Iraq. The Predator (gradually replaced by the Reaper) long remained an isolated precursor, but since the 2010s the use of combat drones has become more and more frequent.

The Turks have had them since 2012 and equipped the Azeris during the war against Armenia in 2020: these models, which are less sophisticated than the American drones, have had a decisive impact. The Russians, Indians, Israelis, South Africans, and Pakistanis are all making their own drones. The Chinese are making them, too (Wing Loong 1 and 2, CaiHong 1 to 6) and, unlike the United States, which only supplies them to its close allies, are selling them to third-party countries.


Various European projects have been developed: some have reached the prototype stage (EADS’ Barracuda, BAE’s Taranis), others have reached a more advanced stage (Dassault’s Neuron) or have been adopted by the armed forces (Safran’s Patroller). The long-postponed European combat drone project was recently launched and is expected to be operational around 2028.

Drones have proven their effectiveness in specific theatres (combat against terrorists, regional conflicts), but in a high-intensity conflict they are not competitive enough. The development challenges faced are stealth, endurance, quality of sensors and increased use of AI. While MALE (Medium Altitude Long Endurance) drones are several metres long, ultralight models are emerging. In 2021, a team of Chinese researchers unveiled a prototype amphibious drone weighing only 1.5 kg.

The military are now worried about a new threat: small civilian drones, equipped with rudimentary weapons (explosives), operating in swarms.

#2 Ground robots

Used mainly for defensive missions (surveillance, site protection) or transport, land robots are less widely deployed. Mobility over rough terrain poses technical problems that demand real feats of engineering, as in the case of ‘legged’ robots such as those of Boston Dynamics; the same difficulties affect the robotic ‘mule’ used by the American army.

Less spectacular but more and more widely used, heavy unmanned ground vehicles (UGVs) mounted on tracks are used for transport tasks but can also be used to support drone systems. Similar to models used in the civilian sector and less expensive than drones, they are developed by various manufacturers, such as Estonia’s Milrem Robotics, whose THeMIS was deployed in 2019 in the Barkhane mission in Mali. The Russian army is one of the few to have armed these vehicles, with the Uran-9 reportedly being tested in Syria.

Debates

There is much debate surrounding the emergence of combat drones. The term “Killer Robot” has been pushed by activists opposed to their use, playing in particular on public fears that these technologies will be used by nefarious forces to dominate a battlefield or even entire populations.

Another fear relates to the role of AI. In July 2015, an open letter on autonomous weapons signed by robotics and AI researchers, but also by astrophysicist Stephen Hawking and entrepreneurs Elon Musk and Steve Wozniak, expressed concern that “we may one day lose control of AI systems through the rise of super intelligence that does not act in accordance with humanity’s desires”.


More specifically, there is a risk of losing control. In 2020, according to a UN report, a drone in Libya killed its target without a “direct order”¹. This raises technical questions – how can we avoid losing control or having systems hacked? – as well as substantive ones: should autonomous military robots be banned? If so, how can the word “autonomous” be defined precisely? If not, how do we allocate responsibility for misuse or malfunction?

It can be argued, however, that the technology could potentially save lives by avoiding civilian casualties, or by ending wars more quickly. Christof Heyns, who served as UN Special Rapporteur until 2016, argued strongly for a moratorium on the development of these systems. His fear was that states would enter into an arms race and that, with a much lower ‘entry cost’ than for nuclear weapons, rogue states or criminal organisations could equip themselves.

But the race has already begun. Future debates between states may no longer be about whether these systems should exist, but rather about the rules of engagement.

1. https://documents-dds-ny.un.org/doc/UNDOC/GEN/N21/037/72/PDF/N2103772.pdf?OpenElement
