Monday, December 18, 2017

Thoughts on the problem of "Confirmation Bias"

Confirmation bias is the propensity of people who hold a strong opinion about something to note and take on board evidence which supports that view while discounting evidence against it.

In extreme cases, sometimes called the Backfire Effect, being presented with evidence against your beliefs can actually make those beliefs stronger.

This is one of a large number of inbuilt biases, some of which are described here, which we need to guard against.

The power of both confirmation bias and the backfire effect has been all too evident whenever I am unable to avoid discussing Brexit with a hardliner on either side - the sort who claims there is no valid argument whatsoever for the other position.

I happen to think there were good arguments both for and against leaving the EU and, looking at it objectively, I must be right in at least one of those beliefs, but one thing that the more hardline protagonists of both viewpoints have in common is that they are completely impervious to any argument they don't want to hear.

I was reading an article today on the subject of confirmation bias, posted earlier this year on the New Yorker site, called

"Why facts don't change our minds."

As this article points out, there is an evolutionary paradox here: prima facie, natural selection should have eliminated either this form of behaviour or us, yet both we and confirmation bias have survived.

The article gives the example of a mouse which had a confirmation bias in favour of the view that there were no cats in the vicinity and ignored evidence of a feline presence. If there actually were cats about, any rodent exhibiting such behaviour would be unable to pass on to a new generation the genes which led to that behaviour from the feline stomach in which it would shortly take up residence.

Therefore there must be a countervailing evolutionary advantage, and it is not obvious what it might be. The article suggests that being able to sustain an opinion in the face of any evidence might be an advantage in "winning one's corner" in debates with other human beings. There may be something in this, provided those debates were not about an existential threat.

My next thought was that confirmation bias might be a factor in making us more cautious - there are certainly real-world decisions with asymmetric consequences where there is an evolutionary advantage in taking precautions.

To take a current example, consider climate change. I personally am not one of those who consider that the majority scientific view - that man-made climate change is a significant threat - has been proven, or anything like it. But the consequences of dismissing that view and being wrong are so much worse than the consequences of taking it seriously and being wrong, and the evidence for man-made climate change, while falling short of proof, is strong enough that under the precautionary principle we do have to take it seriously.

That could explain confirmation bias in favour of a cautious view, in this case and in others, such as those described in my recent Quote of the day from Eric Flint about the principles of Conservatism.

Unfortunately it does not explain confirmation bias in favour of more dangerous ideas such as the suggestion that Jeremy Corbyn might be worth giving a try as Prime Minister or that climate change is a myth.

I am wondering whether confirmation bias in humans has survived up to now because it makes a diversity of views among humans more likely.

The problem with this is that on issues like climate change it may be less important to have a diversity of views than to ensure that the majority view is correct.
