
Are cognitive biases compatible with the scientific method?

Patrice Georget
Lecturer in Psychosociology at the University School of Management IAE Caen

Cognitive biases are well documented in cognitive science research1. These systematic – and therefore predictable – errors are not only a sign of our limited rationality; they also explain how our judgments and decisions work. As such, cognitive biases are a set of inherent processes steering the mind so that it can manage large flows of information, compensate for the limits of memory, preserve cognitive economy, make quick decisions, access meaningful explanations, protect the integrity of the self, and reassure us about our decisions.

Studying cognitive biases scientifically means building up a rational body of knowledge to better understand our irrationality. To do this, the scientific method relies, on the one hand, on the description of objectifiable facts, which we identify using quantitative methods; and, on the other, on invariant explanatory models (a.k.a. theories) – which must fit the known facts and can subsequently be used to predict, test, and compare – through which we seek to understand the causality of phenomena, made possible by the experimental method2.

Since the truth is not always easy to find, science is full of controversies. This is why the scientific method is based on the fundamental principle of “dispute”, i.e. debate of the results obtained, between peers, with publicly available proof. It is therefore collective, subject to criticism and replication, nuanced, conducted over a long period of time and independent from political influence, so that we may converge towards truth.

However, it should be said that the ingredients of the scientific method and cognitive biases are sometimes (or even often) antagonistic. Without claiming to be exhaustive, let’s identify some significant stumbling blocks that may help us to better understand certain contemporary issues around mistrust in science.

Confirm vs. deny

Imagine that I have a rule in mind that I ask you to guess. I inform you that the sequence of numbers “2, 4 and 6” respects this rule. To guess it, you can propose other sequences of three numbers, and I will tell you whether they conform to my rule or not. When this experiment is carried out3, participants logically form a hypothesis about the rule (for example, “a sequence of numbers increasing by two each time”) and test it positively, overwhelmingly proposing confirming series such as “16, 18, 20” and then “23, 25, 27”.

The purpose of these confirmatory statements is not to test IF the hypothesis is true, but to show THAT it is true. Only series that would invalidate the participants’ hypothesis (e.g. here “3, 6, 9”) make it possible to verify IF it is true. This “hypothesis confirmation bias” explains why we spontaneously and carefully avoid looking for arguments that go against our beliefs: the aversion to losing our certainties outweighs the possibility of gaining new knowledge. As someone once said, “Insanity is doing the same thing over and over again and expecting a different result”.
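The logic of the 2-4-6 task can be sketched in a few lines of Python. Assuming – as in Wason’s original experiment – that the hidden rule is simply “any strictly increasing sequence”, the sketch shows why confirmatory tests are uninformative: every sequence that fits the participant’s narrower hypothesis also fits the hidden rule, so only a probe that violates the hypothesis can reveal that it is too narrow.

```python
def hidden_rule(seq):
    """The experimenter's rule (assumed here): strictly increasing numbers."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def participant_hypothesis(seq):
    """A typical participant's guess: numbers increasing by two each time."""
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

# Confirmatory tests: sequences chosen BECAUSE they fit the hypothesis.
# Both rules answer "yes", so these tests cannot tell the rules apart.
for seq in [(16, 18, 20), (23, 25, 27)]:
    assert participant_hypothesis(seq) and hidden_rule(seq)

# A disconfirming probe: a sequence that violates the hypothesis...
probe = (3, 6, 9)
assert not participant_hypothesis(probe)
# ...yet the experimenter still answers "yes" — proof that the
# hypothesis "increasing by two" is too narrow.
assert hidden_rule(probe)
```

Only the disconfirming probe carries information here: a “yes” to “16, 18, 20” is consistent with both rules, whereas a “yes” to “3, 6, 9” rules the participant’s hypothesis out.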


The scientific method, on the other hand, is counter-intuitive, and teaches us to beware of this bias thanks to the double-blind technique, designed to limit self-persuasion, and to an “infirmatory” posture: testing hypotheses by multiplying experiments likely to refute them. Hence, a theory “resists” the facts until proven otherwise. Nevertheless, the process of research is not entirely free from confirmation bias, because positive results are considerably better valued by publications, especially in the so-called “social sciences”. Moreover, reproducibility studies are not always popular, especially when they reveal how many research results in the humanities and social sciences cannot be reproduced (Larivée, S., Sénéchal, C., St-Onge, Z. & Sauvé, M.-R. (2019). « Le biais de confirmation en recherche ». Revue de psychoéducation, 48(1), 245–263).

The power of hypothesis confirmation bias lies in the fact that it does not only concern the present but also… the past! Indeed, we tend to overestimate the probability of an event when we know that it has taken place: after the fact, we often behave as if the future had been obvious to predict (“that was bound to happen”), and as if uncertainty or the unknown played no part in events. This “retrospective” confirmation bias4 is all the more salient in tragic situations and may explain criticism of scientists’ or politicians’ intentions once the human toll of a pandemic, a terrorist attack or an economic crisis is known.

Retrospective bias relies on the extraordinary capacity of the human mind for rationalisation, i.e. the justification of events after the fact. We can never resist telling ourselves a good story, even if it means distorting reality5. As a result, the frantic search for causes is preferred to simple correlations, pseudo-certainties to probabilities, the denial of chance to the consideration of hazards, dichotomous thinking to nuance, the overestimation of low probabilities to the neutral observation of facts: precisely the opposite of what the scientific method teaches us.

Hard science vs. Humanities

Can the scientific method be applied to the study of humans by humans? In a vast series of research studies in experimental social psychology, Jean-Pierre Deconchy and his team explored a fascinating subject: the way humanity thinks about humanity, and the way humanity thinks about the study of humanity. With the help of ingenious experimental set-ups, collected in Les animaux surnaturés6 (published in 2000), the researchers showed how, in the absence of an advanced scientific culture, some of our cognitive filters convince us that our thoughts and behaviours are not based on natural determinants – and that, consequently, by virtue of these cognitive filters, science would be unfit to understand and explain deep human “nature”.

Thus, humans construct a definition of humanity that separates them from the idea that they are creatures of nature, determined by the same laws as other living beings; behind this biological form would hide another “thing”, a “super-nature” – hence a defiance of the very idea that science has anything to say about what humanity is.

In this research, we find the idea of limited rationality, in the sense that knowledge of humanity would be something other than rationality. It is also striking to see that, at the same time as we progress in the cognitive sciences and neurosciences, we are witnessing the flourishing of several pseudo-sciences of the human, adding a little extra soul to the “super-nature” studied by Deconchy. These include a revival of shamanism, ‘energetic’ medicine and personal development techniques. They adopt scientific vocabulary that has an authoritative effect (another cognitive bias) – something we have recently seen in fanciful extrapolations borrowing terms from quantum physics to justify alternative medicines or other mysterious phenomena7.

Thinking against oneself

Our brain draws quick and cheap conclusions to do us a favour. Most of the time, they are sufficient and roughly relevant to our immediate needs. But sometimes they do us a disservice and lead us down a path that discredits the very idea of free will. Fighting against oneself – against the natural slope of cognitive biases that weaken our discernment – requires minimal training in what the scientific method is, and not only for those destined for a scientific profession. It also requires an understanding of the shortcuts our brain uses to make our lives easier, and sometimes to lull us into an illusion of understanding.

Charities such as “La main à la pâte” (in France) and, more globally, projects dedicated to scientific outreach, in connection with universities and research organisations, are meeting a real societal need to reinforce the psycho-social skills not only of schoolchildren but of all citizens. This is the price to pay so that science is not perceived as just another belief, so that doubtful or misleading opinions do not take precedence over the truth, and thus so that our democracies maintain their emancipatory power.

1Vincent Berthet (2018). L’erreur est humaine. Aux frontières de la rationalité. Paris, CNRS Editions
2Benjamin Matalon (1997). Décrire, expliquer, prévoir. Colin
3Klayman, Joshua & Ha, Young-won (1987). “Confirmation, disconfirmation, and information in hypothesis testing”. Psychological Review, 94(2), 211–228
4J. Baron & J. Hershey (1988). “Outcome bias in decision evaluation”. Journal of Personality and Social Psychology, 54(4), 569–579
5Lionel Naccache (2020). Le cinéma intérieur. Projection privée au cœur de la conscience. Odile Jacob
6Jean-Pierre Deconchy (2000). Les animaux surnaturés. Presses Universitaires de Grenoble
7Julien Bobroff (2019). « Sept idées fausses sur la physique quantique ». The Conversation, https://theconversation.com/sept-idees-fausses-sur-la-physique-quantique-113517

Contributors

Patrice Georget


Lecturer in Psychosociology at the University School of Management IAE Caen

Patrice Georget is a lecturer and researcher in psycho-sociology at the IAE Caen University school of management, which he directed from 2015 to 2020. He has been an industry consultant in diversity management and risk prevention. He has been an expert for the APM (Association Progrès du Management) since 2009 and a GERME speaker.
