A couple of weeks ago I published a guide to cognitive biases for journalists. I saved perhaps the biggest one of all — confirmation bias — for a post of its own. It might be one of the best-known biases, but for that very reason it can be easy to underestimate. Here, then, is what you need to know — and what to do to reduce it.
What is confirmation bias — and how does it affect journalism?
Confirmation bias is the tendency to seek out — or more easily believe or recall — information that confirms our existing beliefs.
It leads us to make judgements that are based not on an equal assessment of all the evidence, but only on the evidence we have cherry-picked, remembered, or attributed more credibility to.
Confirmation bias affects journalists in at least three ways:
- It affects reporters and the way that we pursue stories
- It affects our audiences and the way that they interpret and use our reporting
- And it affects our sources in the way that they present information to us
Each aspect deserves separate consideration, and different approaches.
How confirmation bias affects reporting
When up against a tight deadline, journalists often target newsgathering towards those sources which can most quickly confirm the story already in our heads: the “hunch”.
Contradictory information from other sources, after all, might complicate and slow down reporting.
As University of Virginia psychologist Jonathan Haidt expresses it:
“We may think we’re being scientists, but we’re actually being lawyers (PDF). Our “reasoning” is a means to a predetermined end—winning our “case”—and is shot through with biases. They include “confirmation bias,” in which we give greater heed to evidence and arguments that bolster our beliefs, and “disconfirmation bias,” in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.”
Journalism already has procedures for avoiding this. The principles of objectivity and impartiality have been justifiably criticised for resulting in a “view from nowhere” — but seeking “both sides to the story” exists at least in part to counter confirmation bias (and indeed negativity bias, covered in the previous post).
Science has similar techniques, such as the null hypothesis: rather than setting out to establish a cause-and-effect relationship, a scientist starts from the counter-intuitive assumption that there is “nothing to report here”, and only rejects that assumption if the evidence against it is strong enough.
As one description puts it:
“Usually, the null hypothesis is boring and the alternative hypothesis is interesting.”
In newsrooms we have our own phrase for this: “conspiracy or cock-up?” It is used to highlight the tendency, when reporting on problems, to assume that those in power have intentionally caused them.
Instead, it is worth considering that simple human error or incompetence might be an equally likely cause (still an important story to report).
All these techniques are useful to employ in countering your own natural tendency to look for information that confirms what you believe the story is about.
When you do, watch out for another cognitive bias: the tendency to assume that a middle ground between two extremes must be the truth.
Confirmation bias and sources
It’s worth remembering that sources are just as vulnerable to confirmation bias as we are — whether it’s intentional cherry-picking, a tendency to only remember information that confirms their own worldview, or a tendency to attribute extra authority to it.
Factoring this into our interview approach can improve our reporting: anticipating how we will deal with cherry-picked information (diplomatically or adversarially, for example), ensuring that we have done enough background research to understand the strengths and weaknesses of the information sources might draw on, and conducting follow-up research afterwards.
It also means when a source dismisses other sources of information we should retain some scepticism about their (unconscious) reasons for doing so. Does it challenge their self-image or authority in some way?
When dealing with organisations presenting research it is important to be aware of some of the practices used to avoid confirmation bias (such as the null hypothesis). An awareness of p-hacking (also called data dredging) also helps.
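A toy simulation shows why p-hacking matters: test enough hypotheses on pure noise and some will look “significant” by chance alone. The set-up below is invented for illustration — 200 experiments, each flipping a fair coin 100 times, with a result declared “significant” if heads deviate from 50 by 10 or more (roughly the conventional 5% threshold for this sample size). Every “finding” it reports is spurious by construction.

```python
import random

random.seed(1)

def coin_experiment(n_flips=100):
    """Flip a fair coin n_flips times and return the number of heads."""
    return sum(random.random() < 0.5 for _ in range(n_flips))

# Run many independent experiments on pure noise. A fair coin should give
# about 50 heads; we (naively) call a run "significant" if it deviates by
# 10 or more, roughly the 5% level for 100 flips.
n_experiments = 200
significant = [
    i for i in range(n_experiments)
    if abs(coin_experiment() - 50) >= 10
]

print(f"{len(significant)} of {n_experiments} null experiments "
      f"looked 'significant' by chance")
```

An organisation that ran 200 analyses and only presented the handful that crossed the threshold would be data dredging — which is why it is worth asking a source how many comparisons were tested, not just which ones were reported.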
Confirmation bias and audiences
Journalists are especially familiar with confirmation bias because we know our audiences exhibit this behaviour — indeed many of us will work in news organisations that pander to it, either explicitly or implicitly (for example, failing to run stories which challenge the audience’s worldview).
But what’s less well-known about confirmation bias is that it doesn’t only apply to information that we seek out: when exposed to contradictory information we are also more likely to remember the parts that confirm our existing beliefs, or even to misremember the information so that it fits.
Importantly, people don’t just misremember information that fits their worldviews: they also share that misinformation with others, producing downstream effects.
A new study suggests that people may be the source of their own fake news. Communication professor Jason Coronel found that “people can self-generate their own misinformation. It doesn’t all come from external sources.” http://bit.ly/2RGEFhx
There are no simple answers to this right now, but one of the most promising attempts I’ve seen so far is the ‘You Draw It’ feature launched by the New York Times and adopted by a number of other news organisations, including The Guardian and the BBC.
Early evidence suggests the approach performs well in improving user recall, and I’d like to see how it performs in terms of cognitive bias, given how it diplomatically avoids confronting false beliefs. You can create your own using TheyDrawIt.
“Sophisticated messages deserve more attention, not necessarily of higher quantity but probably of higher quality … people get more vigilant and make fewer errors, but meanwhile become less creative and feel more effortful.”
If you are aware of other techniques for tackling confirmation bias I’d be very interested in hearing them — and I promise to approach them with an open mind…