
How gamification can build cognitive resilience to complex crisis situations

Jean Langlois-Berthelot
Doctor of Applied Mathematics and Head of Division in the French Army
Christophe Gaie
Head of the Engineering and Digital Innovation Division at the Prime Minister's Office
Key takeaways
  • Cognitive resilience is the ability to keep a cool head, adapt and make better decisions in stressful situations and when faced with information overload.
  • Traditional methods of training cognitive resilience have shown their limitations, and these tools need to be modernised to better prepare people.
  • To address this, simulation and gamification enable more dynamic, interactive experiences that come closer to the complex conditions of today's crises.
  • Gamification provides a framework that stimulates cognitive complexity, forcing decisions to be made in uncertain environments.
  • However, the development of these technologies raises ethical and deontological questions, such as protecting data and ensuring these tools are used without manipulative or conditioning abuses.

Faced with the acceleration of increasingly complex and uncertain international crises, the issue of cognitive resilience has become central. This concept, which refers to the ability to keep a cool head, adapt and make the best decisions in situations of stress, information overload and ambiguity, is now a major challenge for both military and civilian decision-makers.

Unfortunately, traditional training methods based on fixed procedures and relatively predictable scenarios are revealing their limitations. They do not sufficiently reflect the changing, sometimes chaotic reality in which actors must operate today. This is where simulation and gamification come into play, offering experiences that are more dynamic, interactive and, above all, closer to the real complexity of crises1.

The inadequacy of traditional simulations

Crisis management training has relied on simulators designed to reproduce fairly linear situations where outcomes are relatively predictable. This approach is effective when it comes to learning procedures or reflexes, but proves insufficient for preparing for hybrid crises, cyber crises, or asymmetric conflicts where information is incomplete, unclear, or even contradictory. What should be done if attacks simultaneously hit several vital infrastructures such as the energy2, telecommunications3 and information4 sectors?

Under these conditions, decision-making becomes a real cognitive challenge, influenced by mental load and emotions, as well as sometimes unconscious biases5. These essential human aspects are too often absent from traditional training, which limits its effectiveness in unpredictable situations6. It is therefore necessary to modernise the tools at our disposal in order to be better prepared.

Major powers at the forefront of cognitive technologies

To meet these challenges, several countries have developed innovative solutions. In the United States, for example, DARPA (the Defense Advanced Research Projects Agency, a military technology research and development organisation) is focusing on immersive simulators that incorporate biometric sensors to measure operators' cognitive load and emotional state in real time. This makes it possible to automatically adjust the difficulty of the exercises and offer personalised training7.
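
The adaptive loop described here can be sketched as a simple controller: when the measured load drifts outside a target band, the difficulty of the exercise is nudged back toward it. The Python sketch below is purely illustrative; the thresholds, step size and normalised load values are assumptions for the example, not details of DARPA's actual systems.

```python
def adjust_difficulty(difficulty: float, load: float,
                      target_low: float = 0.4, target_high: float = 0.7,
                      step: float = 0.1) -> float:
    """Nudge exercise difficulty so that measured cognitive load
    stays inside a target band (all values normalised to [0, 1])."""
    if load < target_low:      # trainee under-challenged: raise difficulty
        difficulty += step
    elif load > target_high:   # trainee overloaded: ease off
        difficulty -= step
    return min(1.0, max(0.0, difficulty))

# Simulated session: successive (hypothetical) load readings from
# biometric sensors, each one feeding back into the next exercise.
difficulty = 0.5
for load in (0.2, 0.3, 0.8, 0.9, 0.5):
    difficulty = adjust_difficulty(difficulty, load)
```

Keeping trainees in a middle band of load rather than minimising it is the point of such a loop: too little challenge teaches nothing, while overload degrades learning.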

In China, the approach is even more ambitious, with the development of brain-machine interfaces aimed at directly increasing cognitive abilities, such as alertness or memory, in highly complex simulated environments8. In Europe, a more interdisciplinary approach is being taken, combining artificial intelligence, cognitive science and social analysis to better model human decision-making in crisis situations9. However, the integration of this research into training systems has yet to become widespread.

Nevertheless, the development of these technologies raises crucial ethical and deontological questions. It is imperative to guarantee data protection and ensure that these cognitive augmentation tools are used within a strict ethical framework, without manipulative or conditioning bias and in the interests of citizens, i.e. not exclusively for commercial or military purposes10.

Gamification: more than just a fun tool

Gamification is sometimes seen as a way to make learning more fun, but its potential goes far beyond that. When well designed, it provides a framework that simulates cognitive complexity, forcing participants to make decisions in uncertain environments, with consequences that can be unpredictable11. These serious games, particularly those offering multiple-branch scenarios, are effective training grounds for developing adaptability, stress management and decision-making under pressure12. In cyber crisis training, for example, their effectiveness has been confirmed by several studies13.
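
A multiple-branch scenario of this kind is, at its core, a decision graph: each node is a situation, each choice leads to a different node, and different decision paths end in different outcomes. The Python sketch below illustrates the idea; the scenario text, node names and branching are hypothetical, not taken from any actual serious game.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One decision point in a multiple-branch crisis scenario."""
    text: str
    choices: dict = field(default_factory=dict)  # choice label -> next node id

# Hypothetical cyber-crisis scenario graph (illustrative content only).
SCENARIO = {
    "start": Node("Alert: ransomware detected on the energy grid SCADA.",
                  {"isolate": "isolated", "observe": "spread"}),
    "isolated": Node("Segment isolated; telecom outage reported. Escalate?",
                     {"escalate": "end_contained", "wait": "spread"}),
    "spread": Node("Malware reaches backup systems. Crisis deepens.", {}),
    "end_contained": Node("Crisis contained with limited damage.", {}),
}

def play(decisions):
    """Walk the scenario graph following a list of decisions,
    returning the path of nodes actually visited."""
    node_id, path = "start", ["start"]
    for choice in decisions:
        nxt = SCENARIO[node_id].choices.get(choice)
        if nxt is None:  # dead end or invalid choice: scenario stops here
            break
        node_id = nxt
        path.append(node_id)
    return path

# Two different decision sequences lead to two different outcomes.
print(play(["isolate", "escalate"]))
print(play(["observe"]))
```

Because outcomes depend on the whole path rather than a single answer, replaying the same graph with different choices is what exercises adaptability rather than rote procedure.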

Furthermore, the integration of generative artificial intelligence into the design of these serious games paves the way for even more dynamic scenarios, where AI can generate events, characters or twists in real time, making each training session unique and requiring increased cognitive adaptability14.

French innovations: discreet but concrete work

Between 2022 and 2024, SRAT (System Risk Assessment and Technology) carried out several projects that perfectly illustrate this dynamic. The team developed a simulator based on the Unity engine, recreating a crisis management centre in a conflict zone where users' decisions influence the course of events, thus introducing the notion of uncertainty and unpredictability. Coupled with an electroencephalography (EEG) interface, this simulator measures users' cognitive load and emotions in real time, providing valuable feedback on their decision-making mechanisms. At the same time, SRAT worked on the C-RAND project, a benchmark for emerging technologies in cyber-cognitive interfaces, in collaboration with the British Army. This work enabled the selection of the most suitable solutions for integration into training and operations systems15.
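
As an illustration of how an EEG signal can become a real-time load measure, one proxy commonly used in the workload literature is the ratio of frontal theta power to parietal alpha power. The Python sketch below assumes band powers have already been computed from the raw signal; it is a simplified illustration of the principle, not SRAT's actual pipeline.

```python
def load_index(theta_power: float, alpha_power: float) -> float:
    """Map a theta/alpha band-power ratio (a common workload proxy)
    into [0, 1) so it can drive downstream logic such as difficulty
    adjustment. Inputs are assumed to be non-negative band powers."""
    ratio = theta_power / max(alpha_power, 1e-9)  # guard against division by zero
    return ratio / (1.0 + ratio)                  # squash to [0, 1)
```

Squashing the raw ratio keeps the output in a fixed range, which makes it easier to compare across sessions and to threshold against a target band.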

Another notable project is the development of a digital serious game in HTML, used to train both Master's students and officers at the École Militaire Interarmes. This game simulates a cyber crisis with multiple ramifications, forcing players to manage information overload and make quick and relevant decisions. Feedback shows a significant improvement in adaptability and stress management skills16. Finally, SRAT has provided more than 100 hours of teaching combining artificial intelligence, cognitive science and strategy within the Higher Military Scientific and Technical Education (EMSST), thus preparing future executives for the cognitive challenges of contemporary crises.

To maximise the impact of these tools on learner training, the role of debriefing and structured feedback is crucial. Post-simulation analysis, where the data collected (cognitive load, emotions, decisions) is used for personalised feedback, enables participants to understand their biases and integrate their learning in a lasting way.
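
One minimal form such a post-simulation analysis can take is contrasting the cognitive load recorded during correct and incorrect decisions. The Python sketch below assumes a per-decision session log; the field names and values are hypothetical, chosen only to show the shape of the computation.

```python
from statistics import mean

# Hypothetical per-decision log from one simulation session: each entry
# records the load measured at that moment and whether the decision
# matched the instructor's reference solution.
session_log = [
    {"decision": "isolate",  "load": 0.45, "correct": True},
    {"decision": "wait",     "load": 0.82, "correct": False},
    {"decision": "escalate", "load": 0.78, "correct": False},
    {"decision": "notify",   "load": 0.40, "correct": True},
]

def debrief(log):
    """Contrast mean cognitive load on correct vs. incorrect decisions,
    a simple starting point for personalised debriefing feedback."""
    good = [e["load"] for e in log if e["correct"]]
    bad = [e["load"] for e in log if not e["correct"]]
    return {"mean_load_correct": mean(good) if good else None,
            "mean_load_incorrect": mean(bad) if bad else None}

print(debrief(session_log))
```

A gap between the two means, as in this toy log, is exactly the kind of concrete, personalised observation a debrief can build on: this trainee's errors cluster at high load.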

Towards enhanced cognitive sovereignty

These different approaches clearly show that simulation and gamification are not mere gadgets, but strategic tools capable of transforming training and increasing the cognitive resilience of decision-makers. In a world where information is a lever of power and where the speed of decision-making can make the difference between success and failure, it is essential to strengthen this ability to adapt and make decisions under stress17. Cognitive sovereignty, like technological sovereignty, is now a matter of national security.

It is also important to consider that cognitive resilience is not solely the preserve of decision-makers; an approach open to every citizen would raise awareness and prepare a wider audience for the cognitive challenges of contemporary crises (disinformation, rumours, emergencies). In the current context of hybrid attacks aimed at spreading disinformation, stirring up hatred and breaking national cohesion, raising awareness is a powerful lever of resistance that needs to be developed.

But it is not just about technology. Cognitive resilience is a complex process that requires an integrated approach, combining technology, human understanding and appropriate education. The work of SRAT in France is leading the way, with a pragmatic, rigorous approach that is closely aligned with the real needs of the armed forces. The challenge now is to ensure the wider dissemination and adoption of these innovations, so that cognitive training becomes an operational reality and an asset in the management of tomorrow's crises.

Simulation and gamification open up exciting prospects for meeting the challenge of cognitive resilience. More than ever, these tools must be seen as essential levers for preparing decision-makers for the complex and uncertain realities of modern crises. Combining technological advances with a human-centred approach is the key to building true cognitive sovereignty, guaranteeing efficiency and security in a rapidly changing world.

1. Salas, E., Bowers, C. A., & Rhodenizer, L. (2017). It is not how much you practice, but how you practice: Toward a new paradigm in training research. Psychological Science, 8(4), 270–276.
2. IT-Pro. (2025, April 9). Menaces cyber sur le secteur énergétique européen ! iTPro.fr. https://www.itpro.fr/menaces-cyber-sur-le-secteur-energetique-europeen/
3. Les Echos. (2024, November 18). Câble télécom rompu : Berlin et Helsinki évoquent la « guerre hybride » et la menace russe. Les Echos. https://www.lesechos.fr/monde/enjeux-internationaux/cable-telecom-rompu-berlin-et-helsinki-evoquent-la-guerre-hybride-et-la-menace-russe-2132308
4. Martin, O. (2024, July 25). Campagnes de désinformation, cyberattaques, ingérences… le combat hybride a déjà commencé, à nous d'y faire face. Le Figaro. https://www.lefigaro.fr/vox/monde/le-combat-hybride-a-deja-commence-a-nous-d-y-faire-face-20240725
5. Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
6. Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39(2), 230–253.
7. Gonzalez, C., et al. (2021). Adaptive cognitive workload assessment using EEG and eye tracking in crisis simulations. NeuroImage, 236, 118070.
8. Wang, Y., Li, X., & Zhang, Y. (2022). Brain-computer interfaces for enhancing cognitive performance: A review. Frontiers in Neuroscience, 16, 987654.
9. Bourrier, M., & Bénaben, F. (2021). Cognitive systems and crisis management: An integrative framework. Cognition, Technology & Work, 23(1), 15–29.
10. Shults, F. L., & Wildman, W. J. (2019). Ethics, computer simulation, and the future of humanity. In S. Diallo, W. Wildman, F. Shults, & A. Tolk (Eds.), Human Simulation: Perspectives, Insights, and Applications (New Approaches to the Scientific Study of Religion, vol. 7). Springer, Cham. https://doi.org/10.1007/978-3-030-17090-5_2
11. Michael, D., & Chen, S. (2006). Serious games: Games that educate, train, and inform. Muska & Lipman/Premier-Trade.
12. Connolly, T. M., Boyle, E. A., MacArthur, E., Hainey, T., & Boyle, J. M. (2012). A systematic literature review of empirical evidence on computer games and serious games. Computers & Education, 59(2), 661–686.
13. Anderson, J., & Rainie, L. (2021). The future of crisis simulation and training. Journal of Cognitive Engineering and Decision Making, 15(2), 90–108.
14. Eun, S.-J., Eun, J. K., & Kim, J.-Y. (2023). Artificial intelligence-based personalized serious game for enhancing the physical and cognitive abilities of the elderly. Future Generation Computer Systems, 141, 713–722. https://doi.org/10.1016/j.future.2022.12.017
15. Langlois-Berthelot, L. (2024). Synthèse des innovations en interfaces cyber-cognitives pour l'Armée de Terre. SRAT.
16. Langlois-Berthelot, L. (2023). Rapport interne sur la simulation et la gamification dans la gestion de crise. SRAT.
17. Woods, D. D., & Hollnagel, E. (2019). Resilience engineering: Concepts and precepts. CRC Press.
