Confirmation bias

5 02 2015


A few weeks ago we read our second chapter in the ‘Cognitive illusions’ book: one on ‘confirmation bias’ by Margit Oswald and Stefan Grosjean (who, Google tells me, is a soft-porn photographer: can this be true?).

It opened with a game: “I have made up a rule for the construction of sequences of numbers. For instance, the three numbers ‘2-4-6’ satisfy this rule. Give me other sequences of three numbers you think meet my rule, and I’ll tell you whether you’re right or not.” Mine were 8-10-12 and then 1-3-5, after which I feebly ran out of inspiration. It turns out the rule is just “increasing numbers”, and that what people most often do is generate strings of numbers they think will meet the rule, rather than testing the rule by coming up with strings they think would break it. This testing strategy tends to produce a spurious confirmation of their own hypothesis, even when that hypothesis is wrong (and is rather like induction).
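The game can be sketched in a few lines of Python. The hypothesised rule below (“numbers increasing in steps of two”) is my own assumption, suggested by the example guesses 8-10-12 and 1-3-5; the chapter only states that the real rule is “increasing numbers”.

```python
def real_rule(t):
    # The experimenter's actual rule: any strictly increasing triple.
    a, b, c = t
    return a < b < c

def hypothesis(t):
    # An assumed guess after seeing 2-4-6: numbers going up in steps of two.
    a, b, c = t
    return b - a == 2 and c - b == 2

# Positive test strategy: propose only triples that FIT the hypothesis.
positive_tests = [(8, 10, 12), (1, 3, 5)]

# Falsification strategy: propose triples the hypothesis says should fail.
falsifying_tests = [(1, 2, 3), (5, 10, 100)]

# Every positive test gets a "yes", so the (wrong) hypothesis looks confirmed.
print([real_rule(t) for t in positive_tests])    # [True, True]

# The falsifying tests also get a "yes" — revealing that the hypothesis
# is too narrow, since the real rule is broader than "steps of two".
print([real_rule(t) for t in falsifying_tests])  # [True, True]
```

The point the sketch makes is that only the second list of guesses carries any information: answers to the first list can never distinguish the narrow hypothesis from the true, broader rule.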

Much of the subsequent discussion was about whether this represents a form of confirmation bias – the way we search preferentially for information that supports our pre-existing ideas – or instead a ‘positive test strategy’ (a fairly sensible way of restricting what questions one asks when trying to solve a problem, and which some confuse with bias).

Some incredible examples of real confirmation bias were also given. For example, if people who are for or against the death penalty are asked to read scientific evidence both for and against the death penalty, they are so good at picking out nuggets that support their pre-existing views, and at reading ambiguous evidence as being for their views, that both camps emerge more convinced than ever that their own views are right (despite reading exactly the same material).

What causes such biases? The mechanisms involve what we look for and pay attention to; what we remember afterwards; and how we interpret anything with a bit of wiggle room. But these biased processes can in turn reflect what we have to gain or lose by having our pre-existing views overturned: in other words, the costs of having a hypothesis we’re fond of rejected. Happily, this is a source of hope: most of us do not stubbornly cleave to our views in the face of mounting contradictory evidence, especially if the costs of doing so (e.g. ridicule, or publishing work that is wrong) are high.