
Are cognitive biases compatible with the scientific method?

Patrice Georget
Lecturer in Psychosociology at the University School of Management IAE Caen

Cognitive biases are well documented in cognitive science research1. These systematic (and therefore predictable) errors are not only a sign of our limited rationality; they also explain how our judgments and decisions work. As such, cognitive biases are a set of inherent processes that control the mind so that it can: manage large flows of information, compensate for the limits of memory, preserve cognitive economy, make quick decisions, access meaningful explanations, protect the integrity of the self and reassure us in our decisions.

Studying cognitive biases in a scientific way consists of building up a rational body of knowledge to better understand our irrationality. To do this, the scientific method relies, on the one hand, on the description of objectifiable facts, which we identify using quantitative methods; and, on the other, on invariant explanatory models (a.k.a. theories), which must correspond with known facts and can subsequently be used to predict, test, and compare, and through which we seek to understand the causality of phenomena, made possible by an experimental method2.

Since the truth is not always easy to find, science is full of controversies. This is why the scientific method is based on the fundamental principle of "dispute", i.e. debate of the results obtained, between peers, with publicly available evidence. The process is therefore collective, subject to criticism and replication, nuanced, carried out over a long period of time and independent of political influence, so that we may converge towards the truth.

However, it should be said that the ingredients of the scientific method and cognitive biases are sometimes (or even often) antagonistic. Without claiming to be exhaustive, let's identify some significant stumbling blocks that may help us to better understand certain contemporary issues around mistrust in science.

Confirm vs. deny

Imagine that I have a rule in mind that I ask you to guess. I inform you that the sequence of numbers "2, 4, 6" respects this rule. To guess it, you can propose other sequences of three numbers, and I will tell you whether or not they conform to my rule. When this experiment is carried out3, participants logically form a hypothesis about the rule (for example, "a sequence of numbers increasing by two each time") and test it positively, with a large majority of confirmatory series such as "16, 18, 20" and then "23, 25, 27".

The purpose of these confirmatory tests is not to check IF the hypothesis is true, but to show THAT it is true. Only series that would invalidate the participants' hypothesis (e.g. "3, 6, 9" here) make it possible to verify IF it is true. This "hypothesis confirmation bias" explains why we spontaneously and carefully avoid looking for arguments that go against our beliefs: the aversion to losing our certainties outweighs the possibility of gaining new knowledge. As someone once said, "Insanity is doing the same thing over and over again and expecting a different result".
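The asymmetry described above can be made concrete in a few lines of code. This is only a sketch: the article does not state the experimenter's hidden rule, so the one used here ("any strictly increasing sequence", the rule from Wason's classic version of the task) is an assumption for illustration.

```python
# Sketch of the "2, 4, 6" task. The hidden rule below is assumed to be
# Wason's original one ("any strictly increasing sequence").

def hidden_rule(seq):
    """Experimenter's secret rule: the numbers strictly increase."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def hypothesis(seq):
    """Participant's guess: the numbers increase by two each time."""
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

# Confirmatory tests: sequences chosen BECAUSE they fit the hypothesis.
# They also satisfy the hidden rule, so a "yes" teaches nothing new.
for seq in [(16, 18, 20), (23, 25, 27)]:
    assert hypothesis(seq) and hidden_rule(seq)

# Disconfirmatory test: a sequence that violates the hypothesis.
# The experimenter's "yes" reveals that the hypothesis is too narrow.
probe = (3, 6, 9)
assert not hypothesis(probe) and hidden_rule(probe)
```

Every confirmatory probe leaves the participant's too-narrow hypothesis intact; only the probe that breaks it exposes the difference between the two rules.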


The scientific method, on the other hand, is counter-intuitive, and teaches us to beware of this bias thanks to the double-blind technique, designed to limit self-persuasion, and to an "infirmatory" posture: testing hypotheses by multiplying experiments likely to refute them. Hence, a theory "resists" the facts until proven otherwise. Nevertheless, the research process is not entirely free from confirmation bias, because positive results are considerably better valued by publications, especially in the so-called "social sciences". Moreover, reproducibility studies are not always popular, especially when they reveal how many research results in the humanities and social sciences cannot be reproduced (Larivée, S., Sénéchal, C., St-Onge, Z. & Sauvé, M.-R. (2019). « Le biais de confirmation en recherche ». Revue de psychoéducation, 48(1), 245–263).

The power of hypothesis confirmation bias lies in the fact that it concerns not only the present but also... the past! Indeed, we tend to overestimate the probability of an event when we know that it has taken place: after the fact, we often behave as if the future had been obvious to predict ("that was bound to happen"), and as if uncertainty or the unknown played no part in events. This "retrospective" confirmation bias4 is all the more salient in tragic situations and may explain criticism of scientists' or politicians' intentions once the human toll of a pandemic, a terrorist attack or an economic crisis is known.

The retrospective bias relies on the extraordinary capacity of the human mind for rationalisation, i.e. the justification of events after the fact. We can never resist telling ourselves a good story, even if it means distorting reality5. As a result, the frantic search for causes is preferred to simple correlations, pseudo-certainties to probabilities, the denial of chance to the consideration of hazards, dichotomous thinking to nuance, and the overestimation of low probabilities to the neutral observation of facts: precisely the opposite of what the scientific method teaches us.

Hard science vs. Humanities

Can the scientific method be applied to the study of humans by humans? In a vast series of studies in experimental social psychology, Jean-Pierre Deconchy and his team explored a fascinating subject: the way humanity thinks about humanity, and the way humanity thinks about the study of humanity. With the help of ingenious experimental set-ups, collected in 2000 in Les animaux surnaturés6, the researchers showed how, in the absence of an advanced scientific culture, some of our cognitive filters convince us that our thoughts and behaviours are not based on natural determinants, and that, consequently, by virtue of these cognitive filters, science would be unfit to understand and explain deep human "nature".

Thus, humans construct a definition of humanity that separates them from the idea that they are creatures of nature, determined by the same laws as other living beings; a definition according to which, behind this biological form, hides another "thing", a "super-nature", and thus a defiance of the very idea that science has anything to say about what humanity is.

In this research, we find the idea of limited rationality, in the sense that knowledge of humanity would be something other than rationality. It is also striking to see that, at the same time as we progress in the cognitive sciences and neurosciences, we are witnessing several pseudo-human sciences flourish, adding a little extra soul to the "super-nature" studied by Deconchy. These include a revival of shamanism, energetic "medicine" and personal development techniques. They adopt a scientific vocabulary that has an authoritative effect (another cognitive bias), something we have recently seen in fanciful extrapolations borrowing terms from quantum physics to justify alternative medicines or other mysterious phenomena7.

Thinking against oneself

Our brain draws quick and cheap conclusions to do us a favour. Most of the time, they are sufficient and roughly relevant to our immediate needs. But sometimes they do us a disservice and lead us down a path that discredits the very idea of free will. Fighting against oneself, against the natural slope of the cognitive biases that weaken our discernment, requires minimal training in what the scientific method is, and not only for those destined for a scientific profession. It also requires an understanding of the shortcuts our brain uses to make our lives easier, and sometimes to lull us into an illusion of understanding.

Charities such as "La main à la pâte" (in France) and, more globally, projects dedicated to scientific outreach, in connection with universities and research organisations, meet a real societal need to reinforce the psycho-social skills not only of schoolchildren but of all citizens. This is the price to pay so that science is not perceived as just another belief, so that doubtful or misleading opinions do not take precedence over the truth, and so that our democracies maintain their emancipatory power.

1 Vincent Berthet (2018). L'erreur est humaine. Aux frontières de la rationalité. Paris, CNRS Editions
2 Benjamin Matalon (1997). Décrire, expliquer, prévoir. Colin
3 Klayman, Joshua & Ha, Young-won (1987). "Confirmation, disconfirmation, and information in hypothesis testing". Psychological Review, 94(2), 211–228
4 J. Baron & J. Hershey (1988). "Outcome bias in decision evaluation". Journal of Personality and Social Psychology, 54(4), 569–579
5 Lionel Naccache (2020). Le cinéma intérieur. Projection privée au cœur de la conscience. Odile Jacob
6 Jean-Pierre Deconchy (2000). Les animaux surnaturés. Presses Universitaires de Grenoble
7 Julien Bobroff (2019). « Sept idées fausses sur la physique quantique ». The Conversation, https://theconversation.com/sept-idees-fausses-sur-la-physique-quantique-113517

Contributors

Patrice Georget

Lecturer in Psychosociology at the University School of Management IAE Caen

Patrice Georget is a lecturer and researcher in psycho-sociology at the IAE Caen University school of management, which he directed from 2015 to 2020. He has been an industry consultant in diversity management and risk prevention. He has been an expert for the APM (Association Progrès du Management) since 2009 and a GERME speaker.
