Whether discussing vaccination, climate science, the state of the budget or educational reform, it is common to hear calls for “the facts”. The appealing simplicity of the word “fact” is instrumental in its frequent employment as a persuasive device, but it is a deceptive siren call if you really want to change someone’s mind.
There is a strong tendency, particularly in those supporting a scientific position, to think that if only the facts could be made clear, people would follow the logical pathway to an inevitable and common conclusion.
We assume that our clarity of vision allows us to infer a particular conclusion from a set of facts, and that those reaching alternative conclusions must be victims of bias or confused thinking. This often results in a conviction that saying the same thing more slowly, or more loudly, will do the job.
Unfortunately, this is like trying to lift a mattress by one corner; there is very little movement given the effort involved. The delivery of factual information is a necessary condition to change minds. However, it is not always sufficient.
That we accept a view that someone else does not, or vice versa, is seldom a function of intellect or capacity to reason. It is better understood as a difference in prior beliefs. Understanding this is crucial to improving our persuasive efforts.
One of the most successful accounts of rational reasoning is built on Bayes' theorem. While the theorem is a statement about the probability of something being true, it can also be read as the likelihood of someone believing that a view is true (say, a particular theory or hypothesis), having assigned an initial subjective probability to that view and then processed some evidence in support of it.
The consequence of this, simply put, is that an old view with new evidence leads to an updated view. Bayesian reasoning is widely accepted as a rational means of modifying our beliefs about the world.
The likelihood of a rational person accepting a view after processing evidence in support of that view is a function of:
how likely they think the view is to be true before seeing the evidence (the prior belief); and
how likely they think the view is to explain or predict the evidence – that is, the strength of the connection between the evidence and the view.
What is often overlooked is that both of these contribute strongly to the formation of new beliefs. When used to calculate probabilities, they are multiplied together. If the first is weak, it requires a much stronger contribution from the second. If the first is strong, little impact from the second is needed.
What’s more, the second likelihood contains two specific requirements: that the evidence is accepted as true, and that the evidence is accepted as connected to the target view or belief.
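The multiplication described above can be sketched numerically. This is an illustrative calculation only — the probabilities are invented for demonstration, not drawn from any real survey or study.

```python
def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' theorem: P(view | evidence) =
    P(evidence | view) * P(view) / P(evidence).

    prior: subjective probability the view is true before the evidence.
    p_evidence_if_true: how likely the evidence is if the view is true.
    p_evidence_if_false: how likely the evidence is if the view is false.
    """
    numerator = p_evidence_if_true * prior
    total_evidence = numerator + p_evidence_if_false * (1 - prior)
    return numerator / total_evidence

# A strong prior needs little help from the evidence...
print(posterior(0.9, 0.7, 0.3))  # about 0.95
# ...while the same evidence barely moves a weak prior.
print(posterior(0.1, 0.7, 0.3))  # about 0.21
```

With identical evidence, the believer ends up near certainty while the sceptic remains unconvinced — exactly the multiplicative interaction between prior and evidence described above.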
The climate change ‘debate’
For example, consider providing a rational person who has a weak prior belief in anthropogenic global warming (AGW) with evidence of sea level rise, claiming that the rise is a result of AGW. Such a person will be less likely than someone with a strong prior belief to accept either that sea levels are rising or that the rise is a result of AGW.
For the rational person, using Bayesian reasoning, it takes more to change an existing view than it does to reinforce it. A weak prior belief in a view (or a strong belief opposing it) implies that, for change to occur, the evidence in support of that view needs to be stronger than otherwise. Conversely, a stronger prior belief means ready acceptance of new evidence. Prior beliefs create an inertia that opposes change.
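This inertia can be made concrete by counting how many independent pieces of supporting evidence it takes to move a belief past a given threshold. Again, the evidence strengths here are made-up numbers chosen purely for illustration.

```python
def updates_needed(prior, p_true=0.7, p_false=0.3, target=0.9):
    """Count the Bayesian updates (each with the same illustrative
    evidence strength) needed to push a belief past the target."""
    belief, steps = prior, 0
    while belief < target:
        belief = (p_true * belief) / (p_true * belief + p_false * (1 - belief))
        steps += 1
    return steps

print(updates_needed(0.5))   # an undecided person: 3 pieces of evidence
print(updates_needed(0.05))  # a sceptic: 7 pieces — prior beliefs resist change
```

The sceptic is not reasoning any differently from the undecided person; both apply the same update rule. The gap in how much evidence each requires comes entirely from where they started.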
This explains why so much effort has been expended by those with financial or political reasons to oppose the science of AGW to provide evidence against it. If people have reasons not to change an initial view, then change is less likely, even if evidence for change is strong.
This might seem a little obvious, but remember that the value of understanding Bayesian reasoning lies in showing us the need to address prior beliefs, especially for rational people. It is prior beliefs, not necessarily a lack of clarity or reasoning, that determine differences in how people respond to evidence.
After all, it’s not as if belief in global warming is all about the science. It’s the prior beliefs that drive acceptance of or opposition to an idea.
Using facts as weapons is counterproductive
We cannot assume that a clinical injection of facts has the same effect on any two people. Their prior beliefs may differ and hence the Bayesian switch that changes our minds may not be thrown.
We should not, therefore, assume that anyone who doesn't think like us is less intelligent or less capable of reason; treating people that way is deeply counterproductive. None of us comes to the party free of prior beliefs.
Bayesian reasoning, seen in this way, reflects the dictum popularised by American astronomer Carl Sagan that extraordinary claims require extraordinary evidence. Fair enough, but what is extraordinary for one person may be ordinary for another, depending on prior beliefs.
Given that prior belief is often already in the mix, what we can do is focus on evidence and explain why that evidence is linked to the belief we want to create in others. This is a two-pronged attack that is not always employed, often because the connection between the evidence and the belief seems too complex. The importance of good communication, and often in this context good science communication, becomes apparent in the effort to change views.
Even more importantly, we need to frame new information in a way that is sensitive to prior beliefs. This is not to say it should be compromised, but merely delivered palatably.
Facts are too often weaponised, with the assumption that clear and forceful statements translate into ready acceptance. Unfortunately, for those whose minds are set against it, the opposite is likely to result.
To address both beliefs and reasoning, our arguments must be as thorough, rigorous, accessible and respectful as possible. Just don't forget to include the facts.