Training to reduce cognitive bias may improve decision making after all

Cognitive bias is alleged to have contributed to the downing of Iran Air Flight 655 by the US Navy, which killed 290 people in July 1988. Bandar Abbas/AFP

Ever since Daniel Kahneman and Amos Tversky formalised the concept of cognitive bias in 1972, most empirical evidence has supported the claim that we cannot train ourselves out of these biases to improve our decision making. Cognitive bias has practical ramifications beyond private life, extending to professional domains including business, military operations, political policy, and medicine.

Some of the clearest examples of the effects of bias on consequential decisions have occurred in military operations. Confirmation bias – the tendency to search for and interpret evidence in ways that support our existing hypotheses and beliefs – contributed to the downing of Iran Air Flight 655 in 1988 and, more recently, to the decision to invade Iraq in 2003. It has also been identified as one of the most deleterious biases on social media, actively contributing to polarisation and the formation of echo chambers in online exchanges.

Can one bend one’s intuition?

Despite all the attention paid in recent years to reducing cognitive bias, most evidence suggests that there is little we can do to improve our professional and personal decision making. But a recent experiment suggests that training may improve decision making in the field after all.

We are regularly reminded of the many ways that cognitive biases interfere with our decision making. However, beyond teaching a specific skill or rule – for example, how to calculate expected values – reading articles and books, or even completing courses and business cases, has proven of little help to people in the throes of making a decision. That conclusion was succinctly summarised by Daniel Kahneman, a Nobel laureate and a founder of the field, who said in a 2018 interview:

“You can’t improve intuition. Perhaps, with very long-term training, lots of talk, and exposure to behavioural economics, what you can do is cue reasoning… Unfortunately, the world doesn’t provide cues. And for most people, in the heat of argument, the rules go out the window.”

That view is backed up by a trail of frustrating findings from the 1980s onwards, in which even trained experts such as doctors, realtors and philosophers showed no improvement in decision making when faced with novel contexts and problems in the field.

In an article published in Psychological Science, we report promising results that suggest this post-mortem may be premature. In an experiment involving graduate business students, we found that bias-reduction training can improve decision making in field settings even though reminders of bias are absent.

Training sessions and computer games

The experiment was designed to surreptitiously measure the influence of a single debiasing training intervention on confirmation bias – the tendency to search for evidence confirming hypotheses and ideas we already suspect or believe to be true, to overweight facts and ideas that support that belief, and to discount or ignore evidence that supports alternative hypotheses.

A little more than half of the participants in the experiment (62%) were given the training and then asked to complete a business case designed to test confirmation bias, but they were unaware of the connection between the training and the case. The remaining participants first completed the case and then received the training. Even though the time lag between training and the case averaged 18 days, and the structure of the problems used in the training differed from that of the case, a comparison of the trained and untrained students revealed that training reduced the choice of the inferior, hypothesis-confirming case solution by 29%.

To disguise the relationship between the training and the case, all graduate business students in three programs were invited to play a serious computer game in a set of sessions that took place over a 20-day window. This particular training intervention has produced large and long-lasting reductions in confirmation bias, correspondence bias, and the bias blind spot in laboratory contexts. Originally created for the Office of the Director of National Intelligence, it has been used to reduce bias among US government intelligence analysts.

Imagining you’re leading an automotive racing team

All graduate students in the three programs also completed, in one of their courses, an unannounced business case known as “Carter Racing”, modelled on the fatal decision to launch the Challenger space shuttle in 1986. Here, each student acted as the lead of an automotive racing team making a high-stakes, go/no-go decision: remain in a race or withdraw from it. We then used natural variation in the training schedule to test whether the effects of the debiasing training would transfer to improved decision making in the case, when trainees were not aware that their decision making would be examined for bias.

At first sight, the case narrative and payoff structure favour the hypothesis-confirming choice: remaining in the race. A careful examination of the data provided in the case, however, reveals that withdrawing from the race is the objectively superior option, but seeing this requires combining two charts. The first chart tracks the frequency of engine failures in relation to temperature at the time of the race. The other tracks the frequency of races without engine failures by temperature at the time of the race. Casual inspection of either chart alone does not reveal the clear relationship between failures and temperature, but when both charts are considered together, the relationship is strikingly clear: a catastrophic engine failure is nearly certain at the low temperature recorded just before the race is to begin.
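To see why combining the charts matters, consider a minimal sketch in Python. The race records below are invented for illustration – they are not the actual Carter Racing data – but they show how either chart alone looks ambiguous, while the failure rate by temperature band, computable only when the clean races are included, is unmistakable.

```python
# Illustrative sketch only: these race records are invented for
# demonstration and are not the data from the Carter Racing case.
from collections import Counter

# (temperature in °F, engine_failed) for a hypothetical season of races
races = [
    (53, True), (57, True), (58, True), (63, True), (66, False),
    (67, False), (67, False), (68, False), (69, False), (70, True),
    (70, False), (72, False), (73, False), (75, True), (75, False),
    (76, False), (78, False), (79, False), (80, False), (81, False),
]

def band(temp):
    """Group temperatures into coarse bands, as the case charts do."""
    return "below 65°F" if temp < 65 else "65°F and above"

# Chart 1: failures by temperature. Chart 2: clean races by temperature.
failures = Counter(band(t) for t, failed in races if failed)
clean = Counter(band(t) for t, failed in races if not failed)

# Either chart alone is ambiguous; together they yield the failure RATE.
for b in ("below 65°F", "65°F and above"):
    total = failures[b] + clean[b]
    print(f"{b}: {failures[b]} failures in {total} races "
          f"({failures[b] / total:.0%} failure rate)")
```

On these made-up numbers, every race below 65°F ended in engine failure, a pattern that is invisible when the failures chart is inspected on its own.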

Participants trained before completing the case were 29% less likely to choose the inferior, hypothesis-confirming solution than participants trained after completing the case. To address possible selection biases, such as better students signing up for earlier training sessions, we tested and found that the effect held even when we compared only participants who completed the training one day before or after the case. Further, when controlling for factors including students’ work experience, age, grade point averages, GMAT scores, and propensity to engage in cognitive reflection, we found that the training intervention still significantly improved decision making.
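As a hedged illustration of this kind of robustness check – not the authors’ actual analysis code – a logistic regression along the following lines estimates the effect of training order on the choice of the inferior solution while controlling for the covariates mentioned above. The file and column names here are hypothetical.

```python
# Hypothetical sketch: the file and column names are invented, and this is
# not the analysis code from the published study.
import pandas as pd
import statsmodels.formula.api as smf

# One row per participant: a binary outcome plus the control covariates.
df = pd.read_csv("participants.csv")

# chose_inferior = 1 if the participant picked the hypothesis-confirming
# (inferior) solution; trained_first = 1 if trained before the case.
model = smf.logit(
    "chose_inferior ~ trained_first + work_experience + age"
    " + gpa + gmat + cognitive_reflection",
    data=df,
).fit()

print(model.summary())
# A negative, statistically significant coefficient on trained_first would
# mean training lowered the odds of choosing the inferior solution, holding
# the other covariates constant.
```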

Our analyses of participants’ written justifications for their decisions suggest that their improved decisions were driven by a reduction in confirmatory hypothesis testing. Trained participants spontaneously generated fewer arguments in support of going ahead with the race – the inferior case solution – than did untrained participants.

Improvement is possible

These results provide encouraging evidence that training can improve decision making in the field, where it shapes consequential decisions in professional and personal life. They also address the concern that debiasing training may lead people to overcorrect, or to abandon heuristics – the simple rules that reduce the complexity and effort of decision making but sometimes produce these biases – even in situations where those heuristics are useful. Trained participants were more likely to choose the optimal case solution, so training benefited rather than impaired decision making.

Of course, these findings are limited to a single field experiment. More research is needed to replicate the effect in other domains and to explain why this game-based training intervention transferred more effectively than other forms of training tested in past research. One possibility is that games are more engaging training interventions than lectures or written summaries of research findings. Another is that the game provided intensive practice and personalised feedback. A third is the way the intervention taught players about biases: training may be more effective when it describes cognitive biases and how to mitigate them at an abstract level, and then gives trainees immediate practice testing their new knowledge on different problems and contexts.
