Asymmetrical warfare: new strategies on the battlefield

Surrogate warfare: has technology opened new doors?

Richard Robert, Journalist and Author
On October 27th, 2021
Andreas Krieg
Senior lecturer at King's College London and research associate at the Institute of Middle Eastern Studies
Key takeaways
  • Surrogate warfare is a way for states to remain engaged in endless wars, which take place in a grey area between war and peace.
  • It involves outsourcing and delegating armed operations to the market: private military and security companies, rebel groups and militias, non-state actors.
  • Surrogate warfare also plays out with technological tools and information wars.
  • The surrogates also include non-human agents, from robotics to artificial intelligence.

Is there anything new about ‘surrogate’ warfare?

Andreas Krieg. Surrogate warfare is a long-established concept, used for instance by the US in the late 70s when it trained, funded and equipped the Mujahideen against the Russians. A classic example is the beginning of the British Empire. Britain was able to rule India with just 10,000 British people, and it did this by building up local surrogates that did its fighting locally. With the East India Company, a company was delegated the power to administer a territory and use mercenaries to protect its properties.

However, the global context has changed in terms of how and when we use surrogates. The main factor is an aversion to kinetic operations: not only in the West, but in Russia, China and other countries, today's decision-makers don't want to launch major combat operations. The UN system works in such robust ways that conventional state-on-state war is now frowned upon.

Less conventional wars don't mean no conflict.

But less conventional wars don't mean no conflict. With increasing competition between great powers, as well as the existence of unstable zones where conflicting interests are at stake, we live in a state of semi-permanent crisis that can spill over into a major conflict. The strategy, then, is to undermine your opponent without having to cross the threshold into a proper war. That's where surrogates come in.

Are they capable of achieving the same objectives as conventional armies?

The strategic end that states are trying to achieve is no longer to hold and build, as in the 20th century, when we were trying to push out an enemy, clear a territory and rebuild. The goal now is just to disrupt our adversaries and increase our influence. Surrogacy has very limited relevance for power itself, but it's a game changer when you want to achieve influence. Influence is built through networks, and network building implies delegating to different actors.

Just as it doesn’t pro­vide absolute con­trol, sur­ro­gate war­fare can­not achieve an absolute vic­to­ry. But have we ever been able to achieve it? The answer is prob­a­bly no, though we used to have fair­ly robust strate­gic objec­tives in the 20th cen­tu­ry, when we engaged in war.

When we engage in surrogate warfare, we don't have such objectives. The political reasons for going to war are never clear. We end up being committed to conflicts for an indefinite period in locations far removed from our own metropolitan homeland, which makes it very difficult to sell these wars to the media and the public. But we want to remain engaged, and this is what surrogacy allows us to do.

We can remain engaged in conflicts that are not vital for our national interests, with very little democratic oversight and accountability, and with plausible deniability.

What you create through surrogates is complex: it's an assemblage bringing together state actors, non-state actors and technology, a network that is difficult to unravel. Everyone has a degree of plausible deniability. This discretion allows states to act without parliamentary oversight, without checks and balances, and it allows what I call "cabinet warfare," just as in the 18th century, when princes fought wars as they individually saw fit. In the 20th century, with wars involving not only public funding but also citizens' lives, this kind of warfare was naturally limited. With surrogates, the acceptability equation is quite different.

What you said seems even more relevant with non-human proxies.

Indeed. Surrogates cover a broad spectrum, and technologies are a very important part of it, since they are also a force multiplier for the military. Drones have been used both for their efficiency and to avoid putting men and women on the ground – nothing new in the kinetic realm; it's an old trend. What is fundamentally new is happening in the cyber information domain.

Information wars use surrogate actors to undermine consensus building. They use the information space to influence not just individuals but large communities, mobilising them to do something that they otherwise wouldn't do. It's warfare by other means, just as Clausewitz said warfare was politics by other means. It is fundamentally changing how warfare operates because it again sits below the threshold of war while serving a strategic political end. It is almost undetectable and definitely not illegal.

Information wars use surrogate actors to undermine consensus by using information to influence not just individuals but large communities.

We have evidence of Russian meddling in the UK, France, Germany, and the US. Targeting discourse in a democracy means that you mobilise civil society to have an impact on policymaking. It is also about changing policy-relevant discourse around the people who make policy. Everyone has Russia in mind, but the United Arab Emirates is an important case study because, especially in France, it has been influential in changing the discourse on issues relating to Islam or the Arab world. By influencing academics or journalists, you create a whole array, an army, of surrogates. The Russians have been weaponising narratives for the past two decades, first to defend themselves, and now offensively, to undermine the socio-political consensus in our countries through polarising debates.

The input might come from Russia, but the proliferation of conspiracy theories happens thanks to domestic citizens, "coincidental surrogates" who are not direct agents of the Russians. This is the power of networks. They will spin ideas, disinformation, fake news, and weaponised narratives.

Warfare is essentially about changing wills, as Clausewitz said. Subversion in the information space allows us to do exactly that without ever having to fight. That doesn't mean it doesn't become violent, as we've seen in the United States this year with the infiltration of weaponised narratives into public discourse. The outcome was violent, albeit not "kinetic" in the conventional sense.

Beyond this version of subversion, how is surrogate warfare expected to evolve in the future?

What happens next is about artificial intelligence. It creates a means to completely delegate decision-making and remove yourself from the process. You're not supplementing the human brain. You're replacing it.

This is already happening at the operational level: AI is part of the robotics in the kinetic machines that are built today. In China, a lot of research is being done on removing the human from the loop. Fifteen years ago, the US was very firm: the human should always remain in the loop. The Chinese think otherwise, and now the Americans are saying that they too need to do more research into using AI and building systems where the human is no longer in the loop. What we see here is an erosion of the human component of warfare. Technology is taking the lead.

That sort of relationship is difficult to accept: you want the patron to control the surrogate. When it comes to artificial intelligence, the human is no longer able to control it. We're changing all the parameters of surrogacy, because in a patron-surrogate relationship, the patron always has a degree of control. Will we eventually have to create machines to control the machines? This is a slippery slope that we're going down.
