
How gamification can build cognitive resilience to complex crisis situations

Jean Langlois-Berthelot
Doctor of Applied Mathematics and Head of Division in the French Army
Christophe Gaie
Head of the Engineering and Digital Innovation Division at the Prime Minister's Office
Key takeaways
  • Cognitive resilience is the ability to keep a cool head, adapt and make better decisions under stress and information overload.
  • Traditional methods of training cognitive resilience have shown their limitations; these tools need to be modernised to better prepare people.
  • Simulation and gamification address this by enabling more dynamic, interactive experiences that are closer to the complex conditions of today's crises.
  • Gamification provides a framework that simulates cognitive complexity, forcing decisions to be made in uncertain environments.
  • However, the development of these technologies raises ethical and deontological questions, such as data protection and ensuring data is used without manipulative or conditioning abuses.

Faced with the acceleration of increasingly complex and uncertain international crises, the issue of cognitive resilience has become central. This concept, which refers to the ability to keep a cool head, adapt and make the best decisions in situations of stress, information overload and ambiguity, is now a major challenge for both military and civilian decision-makers.

Unfortunately, traditional training methods based on fixed procedures and relatively predictable scenarios are revealing their limitations. They do not sufficiently reflect the changing, sometimes chaotic reality in which actors must operate today. This is where simulation and gamification come into play, offering experiences that are more dynamic, interactive and, above all, closer to the real complexity of crises¹.

The inadequacy of traditional simulations

Crisis management training has relied on simulators designed to reproduce fairly linear situations where outcomes are relatively predictable. This approach is effective when it comes to learning procedures or reflexes, but proves insufficient for preparing for hybrid crises, cyber crises, or asymmetric conflicts where information is incomplete, unclear, or even contradictory. What should be done if attacks simultaneously hit several vital infrastructures such as the energy², telecommunications³ and information⁴ sectors?

Under these conditions, decision-making becomes a real cognitive challenge, influenced by mental load and emotions, as well as sometimes unconscious biases⁵. These essential human aspects are too often absent from traditional training, which limits its effectiveness in unpredictable situations⁶. It is therefore necessary to modernise the tools at our disposal in order to be better prepared.

Major powers at the forefront of cognitive technologies

To meet these challenges, several countries have developed innovative solutions. In the United States, for example, DARPA (the Defense Advanced Research Projects Agency, a military technology research and development organisation) is focusing on immersive simulators that incorporate biometric sensors to measure operators' cognitive load and emotional state in real time. This makes it possible to automatically adjust the difficulty of the exercises and offer personalised training⁷.
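The adjustment mechanism described above can be sketched as a simple feedback loop: a normalised load reading nudges scenario difficulty so the trainee stays in a productive band. This is a minimal illustration of the principle, not DARPA's actual algorithm; the class name, thresholds and step size are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class AdaptiveTrainer:
    """Hypothetical controller that keeps a trainee's measured
    cognitive load inside a target band by tuning difficulty."""
    difficulty: float = 0.5      # 0 = trivial, 1 = maximal
    target_low: float = 0.4      # under-loaded below this
    target_high: float = 0.7     # over-loaded above this
    step: float = 0.05           # adjustment per reading

    def update(self, load: float) -> float:
        """Take a normalised load reading (0..1) and nudge difficulty."""
        if load > self.target_high:    # trainee saturated: ease off
            self.difficulty -= self.step
        elif load < self.target_low:   # trainee coasting: push harder
            self.difficulty += self.step
        self.difficulty = min(1.0, max(0.0, self.difficulty))
        return self.difficulty

trainer = AdaptiveTrainer()
for reading in [0.9, 0.85, 0.6, 0.3]:  # simulated sensor-derived loads
    trainer.update(reading)
print(round(trainer.difficulty, 2))
```

In a real system the readings would come from EEG or eye-tracking pipelines rather than a hard-coded list, but the control logic stays the same shape.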

In China, the approach is even more ambitious, with the development of brain-machine interfaces aimed at directly increasing cognitive abilities, such as alertness or memory, in highly complex simulated environments⁸. In Europe, a more interdisciplinary approach is being taken, combining artificial intelligence, cognitive science and social analysis to better model human decision-making in crisis situations⁹. However, the integration of this research into training systems has yet to become widespread.

The development of these technologies raises ethical and professional conduct issues, such as data use and protection.

Nevertheless, the development of these technologies raises crucial ethical and deontological questions. It is imperative to guarantee data protection and ensure that these cognitive augmentation tools are used within a strict ethical framework, without manipulative or conditioning bias and in the interests of citizens, i.e. not exclusively for commercial or military purposes¹⁰.

Gamification: more than just a fun tool

Gamification is sometimes seen as a way to make learning more fun, but its potential goes far beyond that. When well designed, it provides a framework that simulates cognitive complexity, forcing participants to make decisions in uncertain environments, with consequences that can be unpredictable¹¹. These serious games, particularly those offering multiple-branch scenarios, are effective training grounds for developing adaptability, stress management and decision-making under pressure¹². In cyber crisis training, for example, their effectiveness has been confirmed by several studies¹³.
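A multiple-branch scenario of the kind described here is, at its core, a graph of decision nodes. The following sketch shows the idea with an invented mini-scenario; the node names, prompts and outcomes are purely illustrative.

```python
# Minimal sketch of a multiple-branch crisis scenario: a graph where each
# decision leads to a new situation. All content here is illustrative.
SCENARIO = {
    "start": {
        "prompt": "Power grid alert: partial outage reported.",
        "choices": {"escalate": "crisis_cell", "monitor": "overload"},
    },
    "crisis_cell": {
        "prompt": "Crisis cell activated; telecom failures now reported.",
        "choices": {"prioritise_comms": "stabilised", "split_teams": "overload"},
    },
    "overload": {"prompt": "Information overload: response degraded.", "choices": {}},
    "stabilised": {"prompt": "Situation stabilised.", "choices": {}},
}

def play(decisions):
    """Walk the scenario graph along a list of decisions; return the path."""
    node, path = "start", ["start"]
    for d in decisions:
        choices = SCENARIO[node]["choices"]
        if d not in choices:
            break                  # invalid or terminal: run ends here
        node = choices[d]
        path.append(node)
    return path

print(play(["escalate", "prioritise_comms"]))  # → ['start', 'crisis_cell', 'stabilised']
```

Because every path through the graph can end differently, the same exercise supports repeated play without becoming a memorised procedure, which is precisely what fixed-scenario training lacks.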

Furthermore, the integration of generative artificial intelligence into the design of these serious games paves the way for even more dynamic scenarios, where AI can generate events, characters or twists in real time, making each training session unique and requiring increased cognitive adaptability¹⁴.
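The mechanism of injecting unscripted events into a session can be shown with a toy stand-in: here random sampling from a fixed pool plays the role that a generative model would play in a real system. The event pool and function names are invented for illustration.

```python
import random

# Toy stand-in for AI-driven event injection. A real system would query a
# generative model for novel events; random sampling only demonstrates
# the injection mechanism itself.
EVENT_POOL = [
    "rumour spreads on social media",
    "backup generator fails",
    "key officer unreachable",
    "conflicting field reports",
]

def run_session(steps: int, seed: int) -> list[str]:
    """Produce one unscripted event per step. Seeding makes each
    training session reproducible for debriefing, yet distinct
    from sessions run with other seeds."""
    rng = random.Random(seed)
    return [rng.choice(EVENT_POOL) for _ in range(steps)]

print(run_session(3, seed=42))
```

The reproducibility matters operationally: a seeded session can be replayed exactly during debriefing, while a fresh seed guarantees the next cohort faces a different sequence.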

French innovations: discreet but concrete work

Between 2022 and 2024, SRAT (System Risk Assessment and Technology) carried out several projects that perfectly illustrate this dynamic. The team developed a simulator based on the Unity engine, recreating a crisis management centre in a conflict zone where users' decisions influence the course of events, thus introducing the notion of uncertainty and unpredictability. Coupled with an electroencephalography (EEG) interface, this simulator measures users' cognitive load and emotions in real time, providing valuable feedback on their decision-making mechanisms. At the same time, SRAT worked on the C-RAND project, a benchmark of emerging technologies in cyber-cognitive interfaces, in collaboration with the British Army. This work enabled the selection of the most suitable solutions for integration into training and operations systems¹⁵.

Another notable project is the development of a digital serious game in HTML, used to train both Master's students and officers at the École Militaire Interarmes. This game simulates a cyber crisis with multiple ramifications, forcing players to manage information overload and make quick, relevant decisions. Feedback shows a significant improvement in adaptability and stress management skills¹⁶. Finally, SRAT has provided more than 100 hours of teaching combining artificial intelligence, cognitive science and strategy within the Higher Military Scientific and Technical Education programme (EMSST), thus preparing future executives for the cognitive challenges of contemporary crises.

Simulation and gamification are strategic tools that can transform training and increase the cognitive resilience of decision-makers.

To maximise the impact of these tools on learner training, the role of debriefing and structured feedback is crucial. Post-simulation analysis, where the data collected (cognitive load, emotions, decisions) is used for personalised feedback, enables participants to understand their biases and integrate their learning in a lasting way.
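The kind of post-simulation analysis described above can be sketched as a simple aggregation over a session log: summarise overall load and accuracy, and flag the decisions taken under excessive load for targeted review. The log format and the flagging threshold are assumptions for illustration, not a description of SRAT's internal tools.

```python
from statistics import mean

def debrief(log, load_threshold=0.75):
    """Summarise a session log of (decision, cognitive_load, correct)
    tuples and flag wrong decisions taken under excessive load.
    The 0.75 threshold is an illustrative assumption."""
    avg_load = mean(load for _, load, _ in log)
    accuracy = sum(ok for _, _, ok in log) / len(log)
    flagged = [d for d, load, ok in log if load > load_threshold and not ok]
    return {"avg_load": round(avg_load, 2),
            "accuracy": round(accuracy, 2),
            "review": flagged}

# Hypothetical session: decision label, normalised load, outcome.
session = [("isolate_network", 0.56, True),
           ("notify_press",    0.82, False),
           ("restore_backup",  0.60, True),
           ("reassign_staff",  0.90, False)]
print(debrief(session))
```

A debrief built this way lets the instructor point at concrete moments ("both of your errors happened above the load threshold") rather than offering generic advice, which is what makes the feedback personalised and lasting.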

Towards enhanced cognitive sovereignty

These different approaches clearly show that simulation and gamification are not mere gadgets, but strategic tools capable of transforming training and increasing the cognitive resilience of decision-makers. In a world where information is a lever of power and where the speed of decision-making can make the difference between success and failure, it is essential to strengthen this ability to adapt and make decisions under stress¹⁷. Cognitive sovereignty, like technological sovereignty, is now a matter of national security.

It is also important to consider that cognitive resilience is not solely the preserve of decision-makers; an approach open to every citizen would raise awareness and prepare a wider audience for the cognitive challenges of contemporary crises (disinformation, rumours, emergencies). In the current context of hybrid attacks aimed at spreading disinformation, stirring up hatred and breaking national cohesion, raising awareness is a powerful lever of resistance that needs to be developed.

But it is not just about technology. Cognitive resilience is a complex process that requires an integrated approach, combining technology, human understanding and appropriate education. The work of SRAT in France is leading the way, with a pragmatic, rigorous approach that is closely aligned with the real needs of the armed forces. The challenge now is to ensure the wider dissemination and adoption of these innovations, so that cognitive training becomes an operational reality and an asset in the management of tomorrow's crises.

Simulation and gamification open up exciting prospects for meeting the challenge of cognitive resilience. More than ever, these tools must be seen as essential levers for preparing decision-makers for the complex and uncertain realities of modern crises. Combining technological advances with a human-centred approach is the key to building true cognitive sovereignty, guaranteeing efficiency and security in a rapidly changing world.

1. Salas, E., Bowers, C. A., & Rhodenizer, L. (2017). It is not how much you practice, but how you practice: Toward a new paradigm in training research. Psychological Science, 8(4), 270–276.
2. IT-Pro. (2025, April 9). Menaces cyber sur le secteur énergétique européen ! iTPro.fr. https://www.itpro.fr/menaces-cyber-sur-le-secteur-energetique-europeen/
3. Les Echos. (2024, November 18). Câble télécom rompu : Berlin et Helsinki évoquent la « guerre hybride » et la menace russe. Les Echos. https://www.lesechos.fr/monde/enjeux-internationaux/cable-telecom-rompu-berlin-et-helsinki-evoquent-la-guerre-hybride-et-la-menace-russe-2132308
4. Martin, O. (2024, July 25). Campagnes de désinformation, cyberattaques, ingérences… le combat hybride a déjà commencé, à nous d'y faire face. Le Figaro. https://www.lefigaro.fr/vox/monde/le-combat-hybride-a-deja-commence-a-nous-d-y-faire-face-20240725
5. Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
6. Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39(2), 230–253.
7. Gonzalez, C., et al. (2021). Adaptive cognitive workload assessment using EEG and eye tracking in crisis simulations. NeuroImage, 236, 118070.
8. Wang, Y., Li, X., & Zhang, Y. (2022). Brain-computer interfaces for enhancing cognitive performance: A review. Frontiers in Neuroscience, 16, 987654.
9. Bourrier, M., & Bénaben, F. (2021). Cognitive systems and crisis management: An integrative framework. Cognition, Technology & Work, 23(1), 15–29.
10. Shults, F. L., & Wildman, W. J. (2019). Ethics, computer simulation, and the future of humanity. In S. Diallo, W. Wildman, F. Shults, & A. Tolk (Eds.), Human Simulation: Perspectives, Insights, and Applications (New Approaches to the Scientific Study of Religion, vol. 7). Springer, Cham. https://doi.org/10.1007/978-3-030-17090-5_2
11. Michael, D., & Chen, S. (2006). Serious games: Games that educate, train, and inform. Muska & Lipman/Premier-Trade.
12. Connolly, T. M., Boyle, E. A., MacArthur, E., Hainey, T., & Boyle, J. M. (2012). A systematic literature review of empirical evidence on computer games and serious games. Computers & Education, 59(2), 661–686.
13. Anderson, J., & Rainie, L. (2021). The future of crisis simulation and training. Journal of Cognitive Engineering and Decision Making, 15(2), 90–108.
14. Eun, S.-J., Eun, J. K., & Kim, J. Y. (2023). Artificial intelligence-based personalized serious game for enhancing the physical and cognitive abilities of the elderly. Future Generation Computer Systems, 141, 713–722. https://doi.org/10.1016/j.future.2022.12.017
15. Langlois-Berthelot, L. (2024). Synthèse des innovations en interfaces cyber-cognitives pour l'Armée de Terre. SRAT.
16. Langlois-Berthelot, L. (2023). Rapport interne sur la simulation et la gamification dans la gestion de crise. SRAT.
17. Woods, D. D., & Hollnagel, E. (2019). Resilience engineering: Concepts and precepts. CRC Press.
