Digital · Society · Neuroscience

Can we develop our intuition to counter misinformation?

Patrice Georget
Lecturer in Psychosociology at the University School of Management IAE Caen
Key takeaways
  • Disinformation, the intentional production and distribution of fake news with the aim of causing harm, raises the question of trust in sources.
  • These practices create information chaos, threaten democratic life and reduce people's critical faculties in favour of dichotomous thinking.
  • Combating disinformation through legal regulation raises the question of the balance between freedom of expression and censorship.
  • To understand why and how disinformation spreads, we need to study the concept of epistemic beliefs.
  • To avoid falling into the trap, it is important to distrust one's intuitions, to trust evidence more than one's own opinion and to look beyond one's socio-political ideologies.

Propaganda is a global strategy put in place by a state, an institution or a community to destabilise a target. Misinformation is the unintentional sharing of fake news or of erroneous or obsolete information, through error, lack of vigilance or ignorance of the subject: there is no actual intention to cause harm. Disinformation, on the other hand, is a propaganda tool that works by deliberately creating and sharing false information with the intention of causing harm. This article focuses on disinformation because, beyond the question of whether a piece of information is true, the concept raises the issue of veracity, and therefore of trust in information sources. We will argue that combating misinformation means asking three questions about knowledge: what to trust, how to trust and who to trust.

The breeding ground for disinformation: our society's present-day vulnerabilities

The rise of disinformation on social networks is generating informational chaos that threatens democratic life: saturation by automated advertising and harvested data, promotion of shocking and conspiracy-oriented content, discrediting of authority figures, and algorithmic logic that creates filter bubbles. "For example, 120,000 years' worth of videos are viewed on YouTube every day. Of these, 70% are watched because the platform's artificial intelligence recommends them"1. Social networks have also come to be seen as one of the most reliable means of consulting the news2. The misinformation of young people in particular sends out worrying signals: one in four young French people subscribes to creationist theories, 16% think that the Earth could well be flat, 20% that Americans have never been to the Moon, and 49% that astrology is a science. A large proportion of them believe that an influencer's popularity is a guarantee of reliability (representative sample aged between 18 and 24)3. Trust in science is strong and stable in all European countries except France, where it has fallen by 20 percentage points in 18 months4. This drop in confidence in science is correlated with support for fake news and conspiracy theories5. At the same time, illectronism (a contraction of illiteracy and electronics) is creating a new area of exclusion, with 14 million French people experiencing difficulties in using digital tools at a time when administrative procedures are increasingly digital-only6.

These vulnerabilities, combined with powerful forces of influence, have damaging effects on our democracies: reduced critical thinking and greater credulity on the part of citizens, inability to resist the seduction of, and support for, dubious ideas, selective exposure to information and the prevalence of confirmation bias, dichotomous thinking and a reduced ability to argue7. Admittedly, these flaws are nothing new (cf. Orson Welles' "War of the Worlds" radio hoax), but the infiltration of supra-national powers, the power of technological tools and the availability of our slumbering brains make this a critical risk.

The levers for combating disinformation and misinformation are therefore a priority for our democracies. They fall into two distinct categories: limiting the production and dissemination of fake news, and limiting its impact.

Can we limit the production of misinformation? Regulation and moderation

350,000 messages are posted on X (formerly Twitter) every minute, for 250 million active users. There are an estimated 2,000 moderators, i.e. roughly one moderator for every 125,000 users8. The same imbalance is observed on other social networks. These figures call into question the very possibility of moderating information, which is increasingly managed by algorithms, a black box whose transparency is often questioned9. Elon Musk, via his company X, filed a suit against California on 8 September 2023, accusing the American state of hindering freedom of expression by forcing platforms to be transparent about content moderation.
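As a quick back-of-the-envelope check, the figures quoted above can be recomputed directly. The sketch below is purely illustrative and uses only the cited Digimind estimates:

```python
# Back-of-the-envelope check on the moderation figures quoted above.
# The three input numbers are the cited estimates; everything else is
# simple arithmetic for illustration only.

active_users = 250_000_000   # estimated active users on X (formerly Twitter)
moderators = 2_000           # estimated human moderators
posts_per_minute = 350_000   # messages posted every minute

users_per_moderator = active_users / moderators
posts_per_moderator_per_day = posts_per_minute * 60 * 24 / moderators

print(f"Users per moderator: {users_per_moderator:,.0f}")                 # ~125,000
print(f"Posts per moderator per day: {posts_per_moderator_per_day:,.0f}") # ~252,000
```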


Legal regulation (ARCOM, DSA) is now being debated, and political institutions are taking up the issue, but the balance between freedom of expression and censorship has not yet been struck. In France, the Autorité de Régulation de la Communication Audiovisuelle et Numérique (ARCOM) acts effectively but remains limited in terms of resources, since its 355 employees work on a wide range of issues (protection of audiences, media education, respect for copyright, information ethics, supervision of online platforms, developments in radio and digital audio, VOD distribution). With the Digital Services Act, Europe is putting a system of accountability in place for the major platforms from 2024, based on a simple principle: what is illegal offline is illegal online. The aim is to protect Internet users by a number of practical means: making the way the recommendation algorithm works accessible to users, along with the possibility of deactivating it, justifying moderation decisions, setting up an explicit mechanism for reporting content, and allowing appeals. Certain types of targeted advertising will be banned. Penalties for non-compliant platforms are set to match the stated ambitions: up to 6% of global turnover.
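To give a concrete, if hypothetical, picture of what "justifying moderation decisions" might involve, here is a minimal sketch of the kind of record a platform could keep for each decision. The field names and structure are invented for illustration; they are not the DSA's legal wording nor any platform's actual schema.

```python
# Hypothetical sketch of a "statement of reasons" record for a moderation
# decision, of the kind DSA-style transparency duties call for (justifying
# the decision, flagging automation, and pointing to an appeal channel).
# All field names are invented for illustration.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ModerationDecision:
    content_id: str              # the post or video concerned
    action: str                  # e.g. "removed", "demoted", "age-restricted"
    ground: str                  # the rule or law the platform says was breached
    automated: bool              # whether an algorithm took the decision
    explanation: str             # plain-language justification shown to the user
    appeal_channel: str          # how the user can contest the decision
    decided_at: datetime = field(default_factory=datetime.now)

decision = ModerationDecision(
    content_id="post-12345",
    action="removed",
    ground="illegal content under national law",
    automated=True,
    explanation="Flagged by an automated classifier, then reviewed by a moderator.",
    appeal_channel="internal complaint system, then an out-of-court dispute body",
)
print(decision)
```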

The fact remains, however, that if we take into account the vulnerabilities mentioned above, the rapid growth in the amount of information exchanged and the difficulties in regulating and moderating platforms, a complementary approach is needed: not just limiting misinformation, but reducing its impact on its targets by strengthening their capacity to resist. But how do we know whether a piece of information is true?

How do we know we know something: epistemic beliefs

Epistemic beliefs relate to the ideas we have about knowledge and the processes by which knowledge is created: what makes us think we know things? What factors contribute to a misperception of knowledge? These questions are central to understanding the spread and impact of misinformation, as well as ways of countering it.

Kelly Garrett and Brian Weeks, of Ohio State University and the University of Michigan, carried out a vast study in the United States in 2017 with the aim of gaining a better understanding of some of the determining factors in adherence to misinformation and conspiracy theories. First, they measured participants' opinions on subjects that circulate in certain conspiracy networks: the claims that the Apollo missions never went to the Moon, that AIDS was intentionally created to harm the homosexual community, that the 9/11 attacks were authorised by the US administration to justify political decisions (military invasion and the curtailment of civil rights), or that JFK, Martin Luther King and Princess Diana were assassinated on the orders of institutions (governments or secret agencies). They also measured participants' opinions on highly sensitive contemporary social issues where a counter-discourse challenges the current scientific consensus: the role of human activity in global warming, or the claim that certain vaccines cause conditions such as autism.

This data was correlated with other measures of the same participants' epistemic beliefs. The results are unambiguous: participants are more likely to subscribe to conspiracy theories, and are more suspicious of scientific discourse, the more (a toy sketch of this kind of correlational analysis appears after the list below):

  • they trust their intuitions to "feel" the truth of things,
  • they believe that facts are not sufficient to call into question what they believe to be true,
  • they consider that all truth is relative to a political context.
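The sketch below makes the logic of this kind of correlational design concrete. The data, variable names and effect sizes are invented for the example; the published study relied on large survey samples and more elaborate regression models, so this illustrates the approach rather than reproducing it.

```python
# Toy illustration of a correlational design like Garrett & Weeks (2017):
# correlate epistemic-belief scales with endorsement of misinformation items.
# The data below is simulated so that it loosely follows the reported pattern.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 500

# Hypothetical 1-5 Likert scores on three epistemic-belief scales.
faith_in_intuition = rng.integers(1, 6, n)    # "truth is something you feel"
need_for_evidence = rng.integers(1, 6, n)     # "beliefs must yield to facts"
truth_is_political = rng.integers(1, 6, n)    # "facts depend on political context"

# Simulated endorsement score for conspiracy / misinformation items.
endorsement = (0.5 * faith_in_intuition
               - 0.4 * need_for_evidence
               + 0.3 * truth_is_political
               + rng.normal(0, 1, n))

for name, scale in [("faith in intuition", faith_in_intuition),
                    ("need for evidence", need_for_evidence),
                    ("truth is political", truth_is_political)]:
    r, p = stats.pearsonr(scale, endorsement)
    print(f"{name}: r = {r:+.2f} (p = {p:.3g})")
```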

Since this study, a great deal of research has shown the extent to which these three elements constitute vulnerabilities in the fight against disinformation. In the remainder of this article, we will explain how each of these epistemic beliefs works, in order to identify the psychosocial skills we need to develop to sharpen our critical thinking.

What to trust: the intuition trap

The first important result of the study by Kelly Garrett and Brian Weeks concerns the trust placed in our intuition to understand the world around us, together with the strong idea that certain truths are not accessible rationally. Instinct, first impressions and a diffuse "gut" feeling are said to be excellent guides for our judgements and decisions. This epistemic belief is widely promoted today in mainstream publications and personal development methods: "enter the magic of intuition", "develop your sixth sense", "manage with intuition", "the powers of intuition". Such titles support the idea that there is a "little je ne sais quoi" that allows us to access hidden truths and understand the world directly by "reconnecting" with ourselves and our environment (the cosmos, pseudo-quantum vibrations, etc.). Inspired by the New Age10, these approaches, which often relate to health and well-being, do not shy away from advocating a return to common sense and to our ability to know things emotionally, without the need for proof, thanks to a "gift". Yet science has often developed contrary to common sense and first intuitions: a heavy body does not fall faster than a light one, hot water can freeze faster than cold water…

Admittedly, scientific research does not dismiss the role of intuitive knowledge, and numerous works and publications are devoted to it11, many of them in medicine under the heading of "gut feeling"12. But what this cognitive-science research says is very different from what we find in personal development books, primarily because intuition is described there as a form of reasoning that is part of a fairly rational process. Scientists have shown, through empirical research carried out with professionals who have developed intuitive expertise (company directors, doctors, firefighters, chess players, athletes, soldiers), that intuition is most effective in experts with a great deal of prior experience: opportunities to form hypotheses based on analysis of their environment, test them in real situations, receive feedback (success or failure), make corrections, test again, and so on, until they arrive at an implicit, efficient and rapid know-how known as intuition. There is nothing esoteric or "quantum" about it: practice, discipline and feedback13 enable us to make rapid decisions when the context demands it. If 82% of Nobel Prize winners acknowledge that their discoveries were made thanks to their intuition14, it is above all because they have accumulated such a wealth of scientific knowledge and methodological experience that they end up aggregating clusters of clues to arrive at an insight: "eureka"!
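One toy way to picture this feedback loop is a simple error-correction update: an estimator that repeatedly compares its guesses with outcomes converges on a fast, accurate "feel" for the right answer, while one that never gets feedback does not improve. This is only an analogy under simplified assumptions, not a model taken from the cited research.

```python
# Toy analogy for intuition as compiled experience: with feedback, repeated
# noisy observations are gradually integrated into an accurate estimate;
# without feedback, no learning occurs. Purely illustrative (delta rule).
import random

random.seed(1)
TRUE_VALUE = 42.0   # the quantity the "expert" learns to estimate at a glance

def train(n_trials: int, with_feedback: bool, learning_rate: float = 0.2) -> float:
    estimate = 0.0
    for _ in range(n_trials):
        observation = TRUE_VALUE + random.gauss(0, 5)   # one noisy experience
        if with_feedback:
            # correct the estimate in proportion to the error signal
            estimate += learning_rate * (observation - estimate)
    return estimate

print("With feedback:   ", round(train(200, with_feedback=True), 1))   # close to 42
print("Without feedback:", round(train(200, with_feedback=False), 1))  # stays at 0.0
```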

The first psychosocial skill to develop in the fight against misinformation is therefore to distrust one's own intuitions by resisting oneself15: "to be a scientist is to fight one's brain", said Gaston Bachelard. It is not a question of suppressing our intuitions, but rather of taking the time to question them, audit them and validate their basis, and in this way to carry out metacognitive work on ourselves, modestly and without indulgence: on what past experience is my intuition based? Have I had the opportunity to receive much feedback on the effects of the actions linked to this intuition? And to what extent am I being influenced by my desires, my emotions or my environment? This is all the more difficult because an impression is, above all… impressive: what matters most is not so much its content as the mental process of its construction and its consequences for the way we think and act16.

How to trust: the method

The second important result of Kelly Garrett and Brian Weeks' study relates to the importance we attach to consistency between facts and opinions. Put another way, can we maintain a belief in the face of a demonstration that contradicts it? Some of us need factual evidence to form an opinion, distrust appearances and are concerned about the method used to produce data. Others less so: the study mentioned above shows that the latter are much more likely to subscribe to false information and conspiracy theories. We remember the "alternative facts" put forward in the days following Trump's inauguration, symptomatic of the post-truth era. These strategies for distorting reality are only possible because they find an audience who, without being fooled by them, do not feel the need for consistency between facts and beliefs. On the contrary, the coherence they seek tends to adjust the facts in favour of their beliefs, a rationalisation effect well known from work on cognitive dissonance. Hugo Mercier and Dan Sperber17 have recently examined this issue in a book defending the thesis that our reason serves us above all… to be right, not only in relation to others, but also in relation to ourselves! Hence the cognitive biases with a self-justifying function: confirmation bias, anchoring, loss aversion, hindsight bias, etc.18.

It is easy to see why combating this is a dauntingly complex task, yet one that is both necessary and possible if we make the effort to teach the scientific method and its components, and not just to students destined for scientific careers! Alternative facts call into question the very notion of truth and of knowledge recognised as valid19, and lead to the sordid conclusion that science is an opinion like any other20. This posture undermines the very foundations of our democratic institutions, which is why knowledge of the scientific method has now become a common good and a genuine psychosocial skill in the WHO's sense: "abilities that enable the development not only of individual well-being, but also of constructive social interactions".

Who to trust: back to basics

The third result from Kelly Garrett and Brian Weeks' study shows that the more individuals believe that facts depend on the political power in place, or on the socio-political context in which they are produced, the more readily they adhere to disinformation and conspiracy theories. This resolutely relativistic type of epistemic belief is facilitated by the fact that our beliefs also serve to reinforce our identification with the groups to which we belong: we evaluate the information we are exposed to according to our socio-ideological proximity to its source. The underlying problem here is therefore one of veracity rather than truth: it is a question of the moral quality of the author of a piece of information, and therefore of the trust we place in him or her. Francis Wolff21 shows that this relativist stance is now a stumbling block in the fight against the risks common to humanity as a whole (global warming, economic crises, resource shortages, species extinction, epidemics, terrorism, etc.), because local demands (identity-based, communitarian, nationalist, xenophobic, religiously radical, etc.) hamper our ability to engage in dialogue and find ways of moving forward collectively.

So what psychosocial skills do we need to develop if we are to know who we can trust and to build common projects that transcend communitarian divisions? To answer this question, Philippe Breton22 carried out a number of empirical studies during experimental argumentation workshops. His results suggest that we need to develop what he calls a "democratic ability", currently far too lacking to build trust, which rests on three skills:

  • Speaking in front of others: practising overcoming the fear of speaking in front of an unfamiliar group. Scientific research shows that this fear is one of the most widespread among adults (55%23), and it hinders the very possibility of establishing the conditions for cooperation.
  • Cognitive empathy: practising defending opinions contrary to one's own. The aim is to learn to identify the quality of arguments and thus regulate one's less solid epistemic beliefs. This strategy is part of the psychological inoculation methods24 designed to strengthen mental immunity.
  • Combating "consensual palaver": soft consensus is a way of avoiding debate that gives the illusion of bringing people together. Practising "frank and peaceful conflictuality"25 is not easy, but it provides the democratic vitality we need.

Conclusion

"Il faut voir comme on se parle. Manifeste pour les arts de la parole" (We need to look at how we talk to each other: a manifesto for the arts of speech) is the title of the latest book by Gérald Garutti, founder of the "Centre des Arts de la Parole", a third place dedicated to rebuilding the psychosocial skills needed to create a space for shared dialogue and to combat the misinformation that is undermining our democracies. Such third places, science and discovery centres, and citizens' laboratories for experimentation share a common goal: developing democratic skills as operational know-how, namely knowing how to argue and to counter-argue, knowing how to listen, suspending one's judgement and eliciting that of others. They also help us to understand how scientific truth is constructed and how this knowledge can be biased: these are the levers of free will and of living together.

1. BRONNER, Gérald (2022). Les lumières à l'ère numérique. Presses Universitaires de France.
2. WATSON, Amy (2021). Share of adults who trust selected news sources worldwide in 2018, by region. Statista. https://www.statista.com/statistics/967356/news-sources-trustworthiness-worldwide/
3. KRAUS, François, LEE BOUYGUES, Helen & REICHSTADT, Rudy (2023). La mésinformation scientifique des jeunes à l'heure des réseaux sociaux. Fondation Jean Jaurès, 12 January 2023. https://www.jean-jaures.org/publication/la-mesinformation-scientifique-des-jeunes-a-lheure-des-reseaux-sociaux/
4. ALGAN, Yann, COHEN, Daniel, DAVOINE, Eva, FOUCAULT, Martial & STANTCHEVA, Stefanie (2021). Confiance dans les scientifiques par temps de crise. Conseil d'analyse économique, n°068-2021, 8 pages.
5. GARRETT, R.K. & WEEKS, B.E. (2017). Epistemic beliefs' role in promoting misperceptions and conspiracist ideation. PLOS ONE 12(9): e0184733. https://doi.org/10.1371/journal.pone.0184733
6. Bercy Numérique (2023). L'illectronisme : fracture numérique et fracture sociale ? https://www.bercynumerique.finances.gouv.fr/lillectronisme-fracture-numerique-et-fracture-sociale
7. BRONNER, Gérald (2021). Apocalypse cognitive. Presses Universitaires de France.
8. Digimind (2023). https://blog.digimind.com/fr/tendances/twitter-chiffres-essentiels-france-monde-2020
9. Les Echos, 24 April 2023. Les algorithmes des réseaux sociaux restent une boîte noire. https://www.lesechos.fr/tech-medias/hightech/les-algorithmes-des-reseaux-sociaux-restent-une-boite-noire-1936126
10. MARQUIS, N. (2017). Les impasses du développement personnel : l'obsession de la quête de soi. Revue du Crieur, 7, 38–53. https://doi.org/10.3917/crieu.007.0038
11. GIGERENZER, G. (2007). Le génie de l'intuition. Paris, Pocket.
12. LECOINTRE, C. (2020). Intuition : génie ou folie ? Réflexions autour de l'usage et de la légitimité de l'intuition dans le soin en pédiatrie. Erès, Revue Française d'Éthique Appliquée, 1–9, 129–143. And PERNIN, T., BOURRILLON, A., STOLPER, E. & BAUMANN, L. (2017). Vers un consensus sur le Gut Feeling aux urgences pédiatriques françaises. Médecine, May 2017, pp. 221–227.
13. KAHNEMAN, D., SIBONY, O. & SUNSTEIN, C.R. (2021). Noise. Pourquoi nous faisons des erreurs de jugement et comment les éviter. Odile Jacob.
14. SENDER, E. (2016). Intuition : le cerveau en roue libre. Sciences et Avenir, 827, January.
15. HOUDÉ, O. (2022). Apprendre à résister. Pour combattre les biais cognitifs. Paris, Flammarion.
16. LE POULTIER, F. (2020). Comment naviguer dans les eaux troubles d'un océan d'absurdités. Rennes, Presses Universitaires de Rennes.
17. MERCIER, H. & SPERBER, D. (2021). L'énigme de la raison. Paris, Odile Jacob.
18. GEORGET, P. (2021). Les biais cognitifs sont-ils compatibles avec la méthode scientifique ? Polytechnique Insights, 6 July 2021. https://www.polytechnique-insights.com/tribunes/societe/les-biais-cognitifs-sont-ils-compatibles-avec-la-methode-scientifique/
19. ESQUERRE, A. (2018). Le vertige des faits alternatifs. Qu'est devenue la vérité ? Paris, Éditions Textuel.
20. KLEIN, É. (2020). Le goût du vrai. Tracts Gallimard.
21. WOLFF, F. (2019). Plaidoyer pour l'universel. Paris, Fayard.
22. BRETON, P. (2006). L'incompétence démocratique. Éditions La Découverte.
23. ANDRÉ, C., LÉGERON, P. & PELISSOLO, A. (2023). La nouvelle peur des autres. Paris, Odile Jacob.
24. COOK, J., LEWANDOWSKY, S. & ECKER, U.K.H. (2017). Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. PLOS ONE 12(5): e0175799. https://doi.org/10.1371/journal.pone.0175799
25. GARUTTI, G. (2023). Il faut voir comme on se parle. Manifeste pour les arts de la parole. Actes Sud.
