
Are cognitive biases compatible with the scientific method?

Patrice Georget
Lecturer in Psychosociology at the University School of Management IAE Caen

Cognitive biases are well documented in cognitive science research1. These systematic – and therefore predictable – errors are not only a sign of our limited rationality; they also explain how our judgments and decisions work. As such, cognitive biases are a set of inherent processes that steer the mind so that it can: manage large flows of information, compensate for the limits of memory, preserve cognitive economy, make quick decisions, reach meaningful explanations, protect the integrity of the self, and reassure us in our decisions.

Studying cognitive biases in a scientific way consists of building up a rational body of knowledge to better understand our irrationality. To do this, the scientific method relies on the one hand on the description of objectifiable facts, which we identify using quantitative methods; and on the other, on invariant explanatory models (a.k.a. theories) – which must correspond with known facts and can subsequently be used to predict, test, and compare – through which we seek to understand the causality of phenomena, something made possible by the experimental method2.

Since the truth is not always easy to find, science is full of controversies. This is why the scientific method is based on the fundamental principle of “dispute”, i.e. debate of the results obtained, between peers, supported by publicly available evidence. It is therefore collective, subject to criticism and replication, nuanced, conducted over a long period of time and independent of political influence, so that we may converge towards the truth.

However, it should be said that the ingredients of the scientific method and cognitive biases are sometimes (or even often) antagonistic. Without claiming to be exhaustive, let us identify some significant stumbling blocks that may help us to better understand certain contemporary issues around mistrust in science.

Confirm vs. deny

Imagine that I have a rule in mind that I ask you to guess. I inform you that the sequence of numbers “2, 4 and 6” respects this rule. To guess it, you can propose other sequences of three numbers, and I will tell you whether or not they conform to my rule. When we carry out this experiment3, participants logically form a hypothesis about the rule (for example, “a sequence of numbers increasing by two each time”) and test it positively, with a large majority of confirmatory series such as “16, 18, 20” and then “23, 25, 27”.

The purpose of these confirmatory statements is not to test IF the hypothesis is true, but to show THAT the hypothesis is true. Only series that could invalidate the hypothesis formulated by the participants (e.g. here “3, 6, 9”) make it possible to verify IF it is true. This “hypothesis confirmation bias” explains why we spontaneously and carefully avoid looking for arguments that go against our beliefs: the aversion to losing our certainties outweighs the possibility of gaining new knowledge. As someone once said, “Insanity is doing the same thing over and over again and expecting a different result”.
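The logic of this task can be sketched in a few lines of code. In the classic version of the experiment, the experimenter’s hidden rule is simply “any strictly increasing sequence” – an assumption of this sketch, since the article leaves the rule unstated – while the participant’s hypothesis is the narrower “increasing by two each time”. Confirmatory probes satisfy both rules and therefore teach nothing; only a probe that violates the hypothesis can expose it as too narrow.

```python
def hidden_rule(seq):
    """Experimenter's secret rule (assumed here): numbers strictly increase."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def participant_hypothesis(seq):
    """Participant's guess: each number is the previous one plus two."""
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

# Confirmatory probes: chosen to fit the hypothesis. They all "pass",
# yet teach nothing, because the hypothesis is never put at risk.
for probe in [(16, 18, 20), (23, 25, 27)]:
    assert participant_hypothesis(probe) and hidden_rule(probe)

# Disconfirmatory probe: "3, 6, 9" violates the hypothesis but still
# fits the hidden rule -- the only kind of test that can reveal
# that the hypothesis is too narrow.
probe = (3, 6, 9)
print(participant_hypothesis(probe))  # False
print(hidden_rule(probe))             # True
```

The asymmetry is the point: a probe that conforms to your own hypothesis can only ever confirm it, whereas a probe designed to break it is informative whichever way the answer comes back.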


The scientific method, on the other hand, is counter-intuitive, and teaches us to beware of this bias thanks to the double-blind technique, designed to limit self-persuasion, and to an “infirmatory” posture: testing hypotheses by multiplying experiments likely to refute them. Hence, a theory “resists” the facts until proven otherwise. Nevertheless, the research process is not entirely free from confirmation bias, because positive results are given considerably more value in publications, especially in the so-called “social sciences”. Moreover, reproducibility studies are not always popular, especially when they reveal how many research results in the humanities and social sciences cannot be reproduced (Larivée, S., Sénéchal, C., St-Onge, Z. & Sauvé, M.-R. (2019). « Le biais de confirmation en recherche ». Revue de psychoéducation, 48(1), 245–263).

The power of hypothesis confirmation bias lies in the fact that it concerns not only the present but also… the past! Indeed, we tend to overestimate the probability of an event when we know that it has taken place: after the fact, we often behave as if the future had been obvious to predict (“that was bound to happen”), and as if uncertainty or the unknown played no part in events. This “retrospective” confirmation bias4 is all the more salient in tragic situations and may explain criticism of scientists’ or politicians’ intentions once the human toll of a pandemic, a terrorist attack or an economic crisis is known.

Retrospective bias relies on the extraordinary capacity of the human mind for rationalisation, i.e. the justification of events after the fact. We can never resist telling ourselves a good story, even if it means distorting reality5. As a result, the frantic search for causes is preferred to simple correlations, pseudo-certainties to probabilities, the denial of chance to the consideration of randomness, dichotomous thinking to nuance, and the overestimation of low probabilities to the neutral observation of facts: precisely the opposite of what the scientific method teaches us.

Hard science vs. Humanities

Can the scientific method be applied to the study of humans by humans? In a vast series of studies in experimental social psychology, Jean-Pierre Deconchy and his team explored a fascinating subject: the way humanity thinks about humanity, and the way humanity thinks about the study of humanity. With the help of ingenious experimental set-ups, collected in 2000 in Les animaux surnaturés6, the researchers showed how, in the absence of an advanced scientific culture, some of our cognitive filters convince us that our thoughts and behaviours are not based on natural determinants – and that, consequently, by virtue of these cognitive filters, science would be unfit to understand and explain deep human “nature”.

Thus, humans construct a definition of humanity that separates them from the idea that they are creatures of nature, determined by the same laws as other living beings – as if, behind this biological form, there hid another “thing”, a “super-nature”, and thus a defiance of the very idea that science has anything to say about what humanity is.

In this research, we find the idea of limited rationality, in the sense that knowledge of humanity would be something other than rationality. It is also striking to see that, at the same time as we progress in the cognitive sciences and neurosciences, we are witnessing several pseudo-human sciences flourish, adding a little extra soul to the “super-nature” studied by Deconchy. These include a revival of shamanism, energetic ‘medicine’ and personal development techniques. They adopt scientific vocabulary that has an authoritative effect (another cognitive bias) – something we have recently seen in fanciful extrapolations borrowing terms from quantum physics to justify alternative medicines or other mysterious phenomena7.

Thinking against oneself

Our brain draws quick and cheap conclusions to do us a favour. Most of the time, they are sufficient and roughly relevant to our immediate needs. But sometimes they do us a disservice and lead us down a path that discredits the very idea of free will. Fighting against oneself – against the natural slope of the cognitive biases that weaken our discernment – requires minimal training in what the scientific method is, and not only for those destined for a scientific profession. It also requires an understanding of the shortcuts our brain uses to make our lives easier, and sometimes to lull us into an illusion of understanding.

Charities such as “La main à la pâte” (in France) and, more globally, projects dedicated to scientific outreach, in connection with universities and research organisations, are meeting a real societal need to reinforce the psycho-social skills not only of schoolchildren but of all citizens. This is the price to pay so that science is not perceived as just another belief, so that doubtful or misleading opinions do not take precedence over the truth, and thus so that our democracies maintain their emancipatory power.

1. Vincent Berthet (2018). L’erreur est humaine. Aux frontières de la rationalité. Paris: CNRS Editions
2. Benjamin Matalon (1997). Décrire, expliquer, prévoir. Colin
3. Klayman, Joshua & Ha, Young-won (1987). “Confirmation, disconfirmation, and information in hypothesis testing”. Psychological Review, 94(2), 211–228
4. J. Baron & J. Hershey (1988). “Outcome bias in decision evaluation”. Journal of Personality and Social Psychology, 54(4), 569–579
5. Lionel Naccache (2020). Le cinéma intérieur. Projection privée au cœur de la conscience. Odile Jacob
6. Jean-Pierre Deconchy (2000). Les animaux surnaturés. Presses Universitaires de Grenoble
7. Julien Bobroff (2019). « Sept idées fausses sur la physique quantique ». The Conversation, https://theconversation.com/sept-idees-fausses-sur-la-physique-quantique-113517

Contributors

Patrice Georget

Lecturer in Psychosociology at the University School of Management IAE Caen

Patrice Georget is a lecturer and researcher in psycho-sociology at the IAE Caen University School of Management, which he directed from 2015 to 2020. He has been an industry consultant in diversity management and risk prevention, an expert for the APM (Association Progrès du Management) since 2009, and a GERME speaker.
