Geopolitics | Digital
Asymmetrical warfare: new strategies on the battlefield

Surrogate warfare: has technology opened new doors?

with Richard Robert, Journalist and Author
On October 27th, 2021 | 4 min reading time
Andreas Krieg
Senior lecturer at King's College London and research associate at the Institute of Middle Eastern Studies
Key takeaways
  • Surrogate warfare is a way for states to remain engaged in endless wars, which take place in a grey area between war and peace.
  • It involves outsourcing and delegating armed operations to the market: private military and security companies, rebel groups and militias, non-state actors.
  • Surrogate warfare also plays out with technological tools and information wars.
  • The surrogates also include non-human agents, from robotics to artificial intelligence.

Is there anything new about 'surrogate' warfare?

Andreas Krieg. Surrogate warfare is a conventional concept, used for instance by the US in the late 70s when they trained, funded and equipped the Mujahideen against the Russians. A classic example is the beginning of the British Empire. Britain was able to rule India with just 10,000 British people, and they did this by building up local surrogates that did their fighting locally. The East India Company was delegated the power to administer a territory and to use mercenaries to protect its properties.

However, the global context has changed in terms of how and when we use surrogates. The main factor is an aversion to kinetic operations: not only in the West, but in Russia, China and other countries, today's decision makers don't want to launch major combat operations. The UN system works in such robust ways that conventional state-on-state war is now frowned upon.


But less conventional wars don't mean no conflict. With increasing competition between great powers, as well as the existence of unstable zones where conflicting interests are at stake, we live in a state of semi-permanent crisis that can spill into a major conflict. The strategy then is to undermine your opponent without having to cross the threshold into a proper war. That's where surrogates come in.

Are they capable of achieving the same objectives as conventional armies?

The strategic end of what states are trying to achieve is no longer to hold and build, as in the 20th century, when we were trying to push out an enemy, clear a territory and rebuild. The goal now is just to disrupt our adversaries and increase our influence. Surrogacy has very limited relevance for power itself, but it's a game changer when you want to achieve influence. Influence is built through networks, and network building implies delegating to different actors.

Just as it doesn't provide absolute control, surrogate warfare cannot achieve an absolute victory. But have we ever been able to achieve it? The answer is probably no, though we used to have fairly robust strategic objectives in the 20th century, when we engaged in war.

When we engage in surrogate warfare, we don't have such objectives. The political reasons for going to war are never clear. We end up being committed to conflict for an indefinite period in locations far removed from our own metropolitan homeland, which makes it very difficult to sell this war to the media and the public. But we want to remain engaged, and this is what surrogacy allows us to do.

We can remain engaged in conflicts that are not vital for our national interests, with very little democratic oversight and accountability, and with plausible deniability.

What you create through surrogates is complex: it's an assemblage bringing together state actors, non-state actors and technology, a network that is difficult to unravel. Everyone has a degree of plausible deniability. This discretion allows states to act without parliamentary oversight, without checks and balances, and it enables what I call "cabinet warfare," just as in the 18th century when princes fought wars as they individually saw fit. In the 20th century, with wars involving not only public funding but also citizens' lives, this kind of warfare was naturally limited. With surrogates the acceptability equation is quite different.

What you said seems even more relevant with non-human proxies.

Indeed. Surrogates cover a broad spectrum, and technologies are a very important part of it since they are also a force multiplier for the military. Drones have been used both for their efficiency and to avoid putting men and women on the ground – nothing new in the kinetic realm, it's an old trend. What is fundamentally new is happening in the cyber information domain.

Information wars are using surrogate actors to undermine consensus building. They use the information space to influence not just individuals but large communities, mobilising them to do something that they otherwise wouldn't do. It's warfare by other means, just as Clausewitz said warfare was politics by other means. It's fundamentally changing how warfare operates because it is again below the threshold of war for a strategic political end. It is almost undetectable and definitely not illegal.


We have evidence of Russian meddling in the UK, France, Germany, and the US. Targeting discourse in a democracy means that you mobilise civil society to have an impact on policymaking. It is also about changing policy-relevant discourse around the people who make policy. Everyone has Russia in mind, but the United Arab Emirates are an important case study because, especially in France, they have been influential in changing the discourse on issues relating to Islam or the Arab world. By influencing academics or journalists, you create a whole army of surrogates. The Russians have been weaponising narratives for two decades, first to defend themselves, and now offensively to undermine the social and political consensus in our countries through polarising debates.

The input might come from Russia, but the proliferation of conspiracy theories happens thanks to domestic citizens, "coincidental surrogates" who are not direct agents of the Russians. This is the power of networks. They will spin ideas, disinformation, fake news, and weaponised narratives.

Warfare is essentially about changing wills, Clausewitz said. Subversion in the information space allows us to do exactly that without ever having to fight. That doesn't mean it doesn't become violent, as we've seen in the United States this year with the infiltration of weaponised narratives into the public discourse. The outcome was violent, albeit not "kinetic" in the conventional sense.

Beyond this version of subversion, how is surrogate warfare expected to evolve in the future?

What happens next is about artificial intelligence. It creates a means to completely delegate decision-making and remove yourself from the process. You're not supplementing the human brain. You're substituting it.

This is already happening on the operational level: AI is part of the robotics in the kinetic machines being built today. In China, a lot of research is being done to remove the human from the loop. Fifteen years ago, the US was very firm: the human should always remain in the loop. The Chinese think otherwise, and now the Americans are saying that they, too, need to do more research into using AI and building systems where the human is no longer in the loop. What we see here is an erosion of the human component of warfare. Technology is taking the lead.

That sort of relationship is difficult to accept: you want the patron to control the surrogate. When it comes to artificial intelligence, the human is no longer able to control it. We're changing all the parameters of surrogacy, because in a patron-surrogate relationship, the patron always has a degree of control. Shall we eventually have to create machines to control the machines? This is a slippery slope that we're going down.
