Killer robots: should we be afraid?

Automation, the third revolution in warfare

Richard Robert, Journalist and Author
On November 9th, 2021
Key takeaways
  • Some experts consider autonomous weapons to be the third revolution in warfare technology, after gunpowder and the nuclear bomb.
  • The automation of weapon systems began decades ago. Advances in mobility and the interpretation of environmental information now give them a high degree of autonomy.
  • Development of drones is still ahead of “infantry” robots, which face considerable technical challenges.
  • In their armed version, these systems are mainly used against material targets. Their lethal use is driving ethical debates, but the arms race has already begun.

From guided missiles to autonomous lethal weapons

Guided missiles such as the Exocet, in use since the 1980s, share some characteristics with autonomous robots. When in automatic mode, they use information about their context to adjust their trajectory. However, the term ‘robot’ is reserved for devices with greater autonomy: machines capable of processing a wider variety of information, with endurance extended from a few minutes to several hours, full mobility, and the ability to make a wider range of decisions (e.g. whether or not to fire).

Greater autonomy of ‘lethal autonomous weapons’ (LAWs) has been made possible by advances in on-board computing – namely the miniaturisation of processors and the increased precision of sensors – and in mobility. There are two types of systems: drones and ground robots.

#1 Drones

Most marine and aerial drones are equipped with an automatic mode. Initially used for observation and reconnaissance missions, and later for laser guidance, they were armed in the early 2000s in Afghanistan and Iraq. The Predator (gradually replaced by the Reaper) long remained an isolated precursor, but since the 2010s the use of combat drones has become increasingly frequent.

The Turks have had them since 2012 and equipped the Azeris during the war against Armenia in 2020: these models, less sophisticated than the American drones, have had a decisive impact. The Russians, Indians, Israelis, South Africans, and Pakistanis are all making their own drones. The Chinese are making them too (Wing Loong 1 and 2, CaiHong 1 to 6) and, unlike the United States, which only supplies them to its close allies, are selling them to third-party countries.

UAVs have proven their effectiveness in specific domains, but in high-intensity conflict they are not competitive.

Various European projects have been developed: some have reached the prototype stage (EADS’ Barracuda, BAE Systems’ Taranis), others have reached a more advanced stage (Dassault’s Neuron) or have been adopted by the armed forces (Safran’s Patroller). The long-postponed European combat drone project was recently launched and is expected to be operational around 2028.

Drones have proven their effectiveness in specific theatres (combat against terrorists, regional conflicts), but in a high-intensity conflict they are not competitive enough. The development challenges are stealth, endurance, sensor quality, and increased use of AI. While MALE (Medium Altitude Long Endurance) drones are several metres long, ultralight models are emerging. In 2021, a team of Chinese researchers unveiled a prototype amphibious drone weighing only 1.5 kg.

The military are now worried about a new threat: small civilian drones, equipped with rudimentary weapons (explosives), operating in swarms.

#2 Ground robots

Deployed mainly for defensive missions (surveillance, site protection) or transport, ground robots are less widespread. Mobility over rough terrain poses technical problems that demand real feats of engineering, as in the case of ‘legged’ robots such as those of Boston Dynamics or the robotic ‘mule’ used by the American army.

Less spectacular but increasingly widely used, heavy unmanned ground vehicles (UGVs) mounted on tracks are used for transport tasks but can also support drone systems. Similar to models used in the civilian sector and less expensive than drones, they are developed by various manufacturers, such as Estonia’s Milrem Robotics, whose THeMIS was deployed in 2019 in the Barkhane mission in Mali. The Russian army is one of the few to have armed these vehicles, with the Uran‑9 reportedly being tested in Syria.

Debates

There is much debate surrounding the emergence of combat drones. The term “Killer Robot” has been pushed by activists opposed to their use, playing in particular on the public fear that these technologies will be used by nefarious forces to dominate a battlefield or even entire populations.

Another fear relates to the role of AI. In July 2015, an open letter on autonomous weapons signed by robotics and AI researchers, but also by physicist Stephen Hawking and entrepreneurs Elon Musk and Steve Wozniak, expressed concern that “we may one day lose control of AI systems through the rise of super intelligence that does not act in accordance with humanity’s desires”.

Future debates between states may no longer be about the existence of these systems, but rather about the rules of engagement.

More specifically, there is a risk of losing control. In 2020, according to a UN report, a drone in Libya killed its target without a “direct order”¹. This raises technical questions (how to avoid losing control of systems or having them hacked) as well as substantive ones: should autonomous military robots be banned? If so, how can the word “autonomous” be defined precisely? If not, how do we allocate responsibility for misuse or malfunction?

It can be argued, however, that the technology could potentially save lives by avoiding civilian casualties, or by ending wars more quickly. Christof Heyns, who served as UN Special Rapporteur until 2016, argued strongly for a moratorium on the development of these systems. His fear was that states would enter into an arms race and that, with a much lower ‘entry cost’ than for nuclear weapons, rogue states or criminal organisations could equip themselves.

But the race has already begun. Future debates between states may no longer be about whether these systems should exist, but rather about the rules of engagement.

1. https://documents-dds-ny.un.org/doc/UNDOC/GEN/N21/037/72/PDF/N2103772.pdf?OpenElement
