
Can we develop our intuition to counter misinformation?

Patrice Georget
Lecturer in Psychosociology at the IAE Caen University School of Management
Key takeaways
  • Disinformation, the intentional production and distribution of fake news with the aim of causing harm, raises the question of trust in sources.
  • These practices create information chaos, threaten democratic life and reduce people's critical faculties in favour of dichotomous thinking.
  • Combating disinformation through legal regulation raises the question of the balance between freedom of expression and censorship.
  • To understand why and how disinformation spreads, we need to study the concept of epistemic beliefs.
  • To avoid falling into the trap, it is important to fight against one's intuition, to trust evidence more than one's own opinion and to look beyond one's socio-political ideologies.

Propaganda is a global strategy put in place by a state, an institution or a community to destabilise a target. Misinformation is the unintentional sharing of fake news, or of erroneous or obsolete information, through error, lack of vigilance or lack of knowledge of the subject: there is no actual intention to cause harm. Disinformation, on the other hand, is a propaganda tool that works by deliberately creating and sharing false information with the intention of causing harm. This article focuses on disinformation because, beyond the question of whether a piece of information is true, the concept raises the issue of veracity, and therefore of trust in information sources. We will argue that combating misinformation means asking three questions about knowledge: what to trust, how to trust and who to trust.

The breeding ground for disinformation: our society’s present-day vulnerabilities

The rise of disinformation on social networks is generating informational chaos that threatens democratic life: saturation of automated advertising and harvested data, promotion of shocking and conspiracy-oriented content, discrediting of authority figures, and algorithmic logic that gives rise to filter bubbles. “For example, 120,000 years’ worth of videos are viewed on YouTube every day. Of these, 70% are watched because the platform’s artificial intelligence recommends them”1. Social networks have also come to be seen as one of the most reliable means of consulting the news2. The misinformation of young people in particular sends out worrying signals: one in four young French people subscribes to creationist theories, 16% think that the Earth could well be flat, 20% that the Americans have never been to the Moon, and 49% that astrology is a science. A large proportion of them believe that an influencer’s popularity is a guarantee of reliability (representative sample aged between 18 and 24)3. Trust in science is strong and stable across European countries, except in France, where it has fallen by 20 percentage points in 18 months4. This drop in confidence in science is correlated with support for fake news and conspiracy theories5. At the same time, “illectronism” (a contraction of “illiteracy” and “electronics”) is creating a new area of exclusion, with 14 million French people experiencing difficulty using digital tools at a time when the dematerialisation of services is becoming widespread6.

These vulnerabilities, combined with powerful forces of influence, have damaging effects on our democracies: reduced critical thinking and increased credulity on the part of citizens, inability to resist the seduction of, and support for, dubious ideas, selective exposure to information and the prevalence of confirmation bias, dichotomous thinking and a reduced ability to construct arguments7. Admittedly, these flaws are nothing new (cf. Orson Welles’ radio hoax “The War of the Worlds”), but the infiltration of supra-national powers, the power of technological tools and the availability of our slumbering brains make this a critical risk.

The levers for combating disinformation and misinformation are therefore a priority for our democracies. They fall into two distinct categories: limiting the production and dissemination of fake news, and limiting its impact.

Can we limit the production of misinformation? Regulation and moderation

350,000 messages are posted on X (formerly Twitter) every minute, for 250 million active users. There are an estimated 2,000 moderators, i.e. one moderator for every 125,000 users8. Similar ratios are observed on other social networks. These figures call into question the very possibility of moderating information, which is increasingly managed by algorithms, a black box whose lack of transparency is often criticised9. Elon Musk, via his company X, filed a suit against California on 8 September 2023, accusing the American state of hindering freedom of expression by forcing platforms to be transparent about content moderation.


Legal regulation (ARCOM, DSA) is now being debated, and political institutions are taking up the issue, but the balance between freedom of expression and censorship has not yet been struck. In France, the Autorité de Régulation de la Communication Audiovisuelle et Numérique (ARCOM) acts effectively but remains limited in resources, with 355 employees working on a wide range of issues (protection of audiences, media education, respect for copyright, information ethics, supervision of online platforms, developments in radio and digital audio, VOD distribution). With the Digital Services Act, Europe is putting a system of accountability in place for the major platforms from 2024, based on a simple principle: what is illegal offline is illegal online. The aim is to protect Internet users by a number of practical means: making the workings of the recommendation algorithm accessible to users, along with the possibility of deactivating it, justifying moderation decisions, setting up an explicit mechanism for reporting content, and allowing appeals. Certain types of targeted advertising will be banned. Penalties for non-compliant platforms are set to match the stated ambitions: up to 6% of global turnover.

The fact remains, however, that if we take into account the vulnerabilities mentioned above, the rapid growth in the amount of information exchanged and the difficulties of regulating and moderating platforms, a complementary approach is needed: not just limiting misinformation, but reducing its impact on its targets by strengthening their capacity to resist. But how do we know whether a piece of information is true?

How do we know we know something: epistemic beliefs

Epistemic beliefs relate to the ideas we have about knowledge and the processes by which knowledge is created: what makes us think we know things? What factors contribute to a misperception of knowledge? These questions are central to understanding the spread and impact of misinformation, as well as ways of countering it.

Kelly Garrett and Brian Weeks, of Ohio State University and the University of Michigan, carried out a vast study in the United States in 2017 with the aim of better understanding some of the determining factors in adherence to misinformation and conspiracy theories. First, they measured participants’ opinions on subjects that are controversial in certain conspiracy networks: the claims that the Apollo missions never went to the Moon, that AIDS was an intentional creation designed to harm the homosexual community, that the 9/11 attacks were authorised by the US administration to justify political decisions (military invasion and the reduction of civil rights), or that JFK, Martin Luther King and Princess Diana were assassinated on the orders of institutions (governments or secret agencies). They also measured participants’ opinions on highly sensitive contemporary social issues where there is a counter-discourse to the current scientific consensus: the role of human activity in global warming, or the claim that certain vaccines cause conditions such as autism.

This data was correlated with other measures of the same participants’ epistemic beliefs. The results are unambiguous: participants are more likely to subscribe to conspiracy theories, and more suspicious of scientific discourse, the more:

  • they trust their intuitions to “feel” the truth of things,
  • they believe that facts are not sufficient to call into question what they hold to be true,
  • they consider that all truth is relative to a political context.

Since this study, a great deal of research has shown the extent to which these three elements constitute vulnerabilities in the fight against disinformation. In the remainder of this article, we explain how each of these epistemic beliefs works, in order to identify the psychosocial skills we need to develop to sharpen our critical thinking.

What to trust: the intuition trap

The first important result of the study by Kelly Garrett and Brian Weeks concerns the trust placed in our intuition to understand the world around us, together with the strong idea that certain truths are not accessible rationally. Instinct, first impressions and a diffuse “gut” feeling are said to be excellent guides for our judgements and decisions. This epistemic belief is widely promoted today in mainstream publications and personal development methods: “enter the magic of intuition”; “develop your sixth sense”; “manage with intuition”; “the powers of intuition”. These titles support the idea that there is a “little je ne sais quoi” that allows us to access hidden truths and understand the world directly by “reconnecting” with ourselves and our environment (the cosmos, pseudo-quantum vibrations, etc.). Inspired by the New Age10, these approaches, which often relate to health and well-being, do not shy away from advocating a return to common sense and to our ability to know things emotionally, without the need for proof, thanks to a “gift”. Yet science has often developed contrary to common sense and first intuitions: a heavy body does not fall faster than a light one, and hot water can freeze faster than cold water…

Admittedly, scientific research does not dismiss the role of intuitive knowledge, and numerous works and publications are devoted to it11, many of them in medicine under the heading of “gut feeling”12. But what this cognitive science research says is very different from what we find in personal development books, primarily because intuition is described there as a form of reasoning that is part of a fairly rational process. Scientists have shown (through empirical research carried out with professionals who have developed intuitive expertise, such as company directors, doctors, firefighters, chess players, sportspeople and soldiers) that intuition is most effective in experts with a great deal of prior experience: they have had repeated opportunities to form hypotheses based on the analysis of their environment, test them in real situations, benefit from feedback (success or failure), make corrections, retest, and so on, until they arrive at the implicit, efficient and rapid know-how we call intuition. There is nothing esoteric or “quantum” about it; practice, discipline and feedback13 enable us to make rapid decisions when the context demands it. If 82% of Nobel Prize winners acknowledge that their discoveries were made thanks to intuition14, it is above all because they have accumulated such a wealth of scientific knowledge and methodological experience that they end up aggregating clusters of clues to arrive at an insight: “eureka!”

The first psychosocial skill to develop in the fight against misinformation is therefore to distrust one’s own intuitions by resisting oneself15: “to be a scientist is to fight one’s brain”, said Gaston Bachelard. It is not a question of suppressing our intuitions, but rather of taking the necessary time to question them, audit them and validate their basis, and in this way carry out metacognitive work on ourselves in an unindulgent and modest way: on what past experience is my intuition based? Have I had the opportunity to receive much feedback on the effects of my actions linked to this intuition? And to what extent am I being influenced by my desires, my emotions or my environment? This is all the more difficult because an impression is above all… impressive: what matters most is not so much its content as the mental process of its construction and its consequences for the way we think and act16.

How to trust: the method

The second important result of Kelly Garrett and Brian Weeks’ study relates to the importance we attach to consistency between facts and opinions. Put another way, can we maintain a belief in the face of a demonstration that contradicts it? Some of us need factual evidence to form an opinion, distrust appearances and are concerned about the method used to produce data. Others less so: the study mentioned above shows that the latter are much more likely to subscribe to false information and conspiracy theories. We remember the “alternative facts” of the day after Trump’s election, symptomatic of the post-truth era. These strategies for distorting reality are only possible because they find an audience who, while not necessarily fooled by them, do not feel the need for consistency between facts and beliefs. On the contrary, the coherence they seek tends to adjust the facts in favour of their beliefs, a rationalisation effect well known from work on cognitive dissonance. Hugo Mercier and Dan Sperber17 have recently examined this issue in a book defending the thesis that our reason serves above all… to be right, not only in relation to others, but also in relation to ourselves! Hence the cognitive biases with a self-justifying function: confirmation bias, anchoring, loss aversion, hindsight bias, etc.18. It is easy to see why combating this is a dauntingly complex task, yet one that is both necessary and possible if we make the effort to teach the scientific method and its components, and not just to students destined for scientific careers!
These alternative facts call into question the very notion of truth and of knowledge recognised as valid19, and lead to the sordid conclusion that science is an opinion like any other20. This posture undermines the very foundations of our democratic institutions, which is why knowledge of the scientific method has now become a common good and a genuine psychosocial skill according to the WHO: “abilities that enable the development not only of individual well-being, but also of constructive social interactions”.

Who to trust: back to basics

The final result from Kelly Garrett and Brian Weeks’ study shows that the more individuals believe that facts depend on the political power in place, or on the socio-political context in which they are produced, the more readily they adhere to disinformation and conspiracy theories. This resolutely relativistic type of epistemic belief is facilitated by the fact that our beliefs also serve to reinforce our identification with the groups to which we belong: we evaluate the information we are exposed to according to our socio-ideological proximity to its source. The underlying problem here is therefore one of veracity rather than truth: it is a question of the moral quality of the author of a piece of information, and therefore of the trust we place in him or her. Francis Wolff21 shows that this relativist stance is now a stumbling block in the fight against the risks common to humanity as a whole (global warming, economic crisis, shortage of resources, extinction of species, epidemics, terrorism, etc.), because local demands (identity-based, communitarian, nationalist, xenophobic, religious radicalism, etc.) hamper our ability to engage in dialogue and find ways of moving forward collectively. So what psychosocial skills do we need to develop if we are to know whom we can trust and build common projects that transcend communitarian divisions? To answer this question, Philippe Breton22 carried out a number of empirical studies during experimental argumentation workshops. His results suggest that we need to develop what he calls a “democratic ability”, currently far too lacking, in order to build trust; it rests on three skills:

  • Speaking in front of others: practising overcoming the fear of speaking in front of an unfamiliar group. Research shows that this fear is one of the most widespread among adults (55%)23, and it hinders the very possibility of establishing the conditions for cooperation.
  • Cognitive empathy: practising defending opinions contrary to one’s own. The aim is to learn to assess the quality of arguments and thus regulate one’s less solid epistemic beliefs. This strategy is part of the psychological inoculation methods24 designed to strengthen mental immunity.
  • Combating “consensual palaver”: soft consensus is a way of avoiding debate that gives the illusion of bringing people together. Practising “frank and peaceful conflictuality”25 is not easy, but it provides the necessary democratic vitality.

Conclusion

“Il faut voir comme on se parle. Manifeste pour les arts de la parole” (We need to look at how we talk to each other: a manifesto for the arts of speech) is the title of the latest book by Gérald Garutti, founder of the Centre des Arts de la Parole, a third place devoted to restoring the psychosocial skills needed to build a space for shared dialogue and to combat the misinformation undermining our democracies. Such third places, science and discovery centres, and citizens’ laboratories for experimentation share a common goal: developing democratic skills in the form of operational know-how: knowing how to argue and counter-argue, knowing how to listen, suspending one’s judgement and eliciting that of others. They also help us to understand how scientific truth is constructed and how this knowledge can be biased: these are the levers of free will and of living together.

1 BRONNER, Gérald (2022). Les lumières à l’ère numérique. Presses Universitaires de France.
2 WATSON, Amy (2021). Share of adults who trust selected news sources worldwide in 2018, by region. Statista. https://www.statista.com/statistics/967356/news-sources-trustworthiness-worldwide/
3 KRAUS, François, LEE BOUYGUES, Helen & REICHSTADT, Rudy (2023). La mésinformation scientifique des jeunes à l’heure des réseaux sociaux. Fondation Jean Jaurès, 12 janvier 2023. https://www.jean-jaures.org/publication/la-mesinformation-scientifique-des-jeunes-a-lheure-des-reseaux-sociaux/
4 ALGAN, Yann, COHEN, Daniel, DAVOINE, Eva, FOUCAULT, Martial & STANTCHEVA, Stefanie (2021). Confiance dans les scientifiques par temps de crise. Conseil d’analyse économique, n°068-2021, 8 pages.
5 GARRETT, R.K. & WEEKS, B.E. (2017). Epistemic beliefs’ role in promoting misperceptions and conspiracist ideation. PLOS ONE 12(9): e0184733. https://doi.org/10.1371/journal.pone.0184733
6 Bercy Numérique (2023). L’illectronisme : fracture numérique et fracture sociale ? https://www.bercynumerique.finances.gouv.fr/lillectronisme-fracture-numerique-et-fracture-sociale
7 BRONNER, Gérald (2021). Apocalypse cognitive. Presses Universitaires de France.
8 Digimind (2023). https://blog.digimind.com/fr/tendances/twitter-chiffres-essentiels-france-monde-2020
9 Les Echos, 24 avril 2023. Les algorithmes des réseaux sociaux restent une boîte noire. https://www.lesechos.fr/tech-medias/hightech/les-algorithmes-des-reseaux-sociaux-restent-une-boite-noire-1936126
10 MARQUIS, N. (2017). Les impasses du développement personnel : l’obsession de la quête de soi. Revue du Crieur, 7, 38–53. https://doi.org/10.3917/crieu.007.0038
11 GIGERENZER, G. (2007). Le génie de l’intuition. Paris, Pocket.
12 LECOINTRE, C. (2020). Intuition : génie ou folie ? Réflexions autour de l’usage et de la légitimité de l’intuition dans le soin en pédiatrie. Erès, Revue Française d’Éthique Appliquée, 1–9, 129–143. And PERNIN, T., BOURRILLON, A., STOLPER, E. & BAUMANN, L. (2017). Vers un consensus sur le Gut Feeling aux urgences pédiatriques françaises. Médecine, mai 2017, pp. 221–227.
13 KAHNEMAN, D., SIBONY, O. & SUNSTEIN, C.R. (2021). Noise. Pourquoi nous faisons des erreurs de jugement et comment les éviter. Odile Jacob.
14 SENDER, E. (2016). Intuition : le cerveau en roue libre. Sciences et Avenir, 827, janvier.
15 HOUDÉ, O. (2022). Apprendre à résister. Pour combattre les biais cognitifs. Paris, Flammarion.
16 LE POULTIER, F. (2020). Comment naviguer dans les eaux troubles d’un océan d’absurdités. Rennes, Presses Universitaires de Rennes.
17 MERCIER, H. & SPERBER, D. (2021). L’énigme de la raison. Paris, Odile Jacob.
18 GEORGET, P. (2021). Les biais cognitifs sont-ils compatibles avec la méthode scientifique ? Polytechnique Insights, 6 juillet 2021. https://www.polytechnique-insights.com/tribunes/societe/les-biais-cognitifs-sont-ils-compatibles-avec-la-methode-scientifique/
19 ESQUERRE, A. (2018). Le vertige des faits alternatifs. Qu’est devenue la vérité ? Paris, Éditions Textuel.
20 KLEIN, É. (2020). Le goût du vrai. Tracts Gallimard.
21 WOLFF, F. (2019). Plaidoyer pour l’universel. Paris, Fayard.
22 BRETON, P. (2006). L’incompétence démocratique. Éditions La Découverte.
23 ANDRÉ, C., LÉGERON, P. & PÉLISSOLO, A. (2023). La nouvelle peur des autres. Paris, Odile Jacob.
24 COOK, J., LEWANDOWSKY, S. & ECKER, U.K.H. (2017). Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. PLOS ONE 12(5): e0175799. https://doi.org/10.1371/journal.pone.0175799
25 GARUTTI, G. (2023). Il faut voir comme on se parle. Manifeste pour les arts de la parole. Actes Sud.
