How much of what they are telling you is true and how much is spin? ITV/PA Wire/PA Images

UK election 2019: public resistance to factchecking

Factchecking journalism has been a feature of UK elections since Channel 4 News launched its FactCheck in 2005. However, in the 2019 campaign, it has achieved a new prominence. This became particularly clear in late November when the Conservative Party press office renamed its Twitter account Factcheck UK. The stunt itself arguably reflected mainstream political awareness of the potential influence of factchecking.

Analysis of the 2019 campaign is still at an early stage, but preliminary findings suggest that factchecking still faces challenges in popular reception. I found similar reactions in my research on the 2017 election, which examined the remit of factchecking, its role in media debates and its reception on Twitter.

The three main national factchecking organisations – FactCheck, the BBC’s Reality Check and independent charity Full Fact – have been more active in live factchecking the televised debates with short items on rolling blogs, but have published slightly fewer full articles explaining their factchecking verdicts: from my initial content analysis, 109 to date compared with 121 in 2017 (though each item might check multiple claims).

But in a significant departure from recent elections, both of the main broadcaster-based factcheckers have been given regular slots on flagship news programming, while Full Fact director Will Moy has made appearances on ITV and Talk Radio. This suggests that public interest in factchecking has overcome concerns that it is just not interesting enough.

Nonetheless, I found in my research that for some people encountering factchecking journalism on Twitter, the very idea of journalists ruling on the truthfulness of what a politician says is a subjective endeavour, so they instinctively perceive the verdict to be biased. This confuses questions about determining truth with ethical questions of impartiality. In many cases, of course, it is simply an expression of a lack of trust in mainstream media in general and the BBC in particular.

‘Calling out’ bias

Labour’s claim that the Conservatives would send Trump £500 million a week in extra drug costs for the NHS provides a good illustration of how these issues have played out this election. In response to Reality Check’s factcheck on the claim, I counted 16 replies – almost one in ten, according to Twitter’s count of 172 replies – that referenced the spat over archive footage inserted into coverage of Johnson at Remembrance Day. Their intention would appear to be to undermine the credibility of the BBC as an arbiter of truth.

Three Twitter users replied to mock it as ironic or nonsensical, implying that the BBC could not be trusted to reflect “reality”.

However, other Twitter users reacted negatively to the interpretive verdict, made by both Reality Check and Full Fact, that Labour’s claim was based on an “unlikely” scenario.

The trouble is that Labour’s claim is based on a prediction of the outcome of US trade negotiations and its impact. These are not, strictly speaking, hard facts as they have not yet occurred. Instead, they are based on, and extrapolated from, facts that are, to a greater or lesser extent, explored in the articles linked to from all three factcheckers’ tweets.

In these articles, the factcheckers explain where the claim came from – Andrew Hill, a contributor to a Channel 4 Dispatches investigation – and how it was calculated. Full Fact checked the figures Hill extrapolated from and linked to the Organisation for Economic Cooperation and Development and NHS sources. FactCheck, meanwhile, linked to Hill’s report, and Reality Check just set out the numbers.

Reality Check spoke directly to Hill and reported his acknowledgement that the figure is a “worst-case scenario” intended to illustrate what is at stake. Full Fact, by contrast, described it more negatively in its headline as a “fairly extreme scenario”.

Reality Check pointed out that only 9% of drugs come directly from the US but couldn’t be sure what that meant for the estimate. Only FactCheck set out in any detail what is known about the US negotiating position and advantage and political pressures in both countries. Full Fact relied entirely on a single expert source, the Nuffield Trust, for its judgement (which was based on past trade deals, not current political intentions).

Taken together, all of the information presented helps the reader to judge how reasonable the claim is, though none are exhaustive. All give a clear sense that prices might go up, but it’s hard to say by how much and, more implicitly (or explicitly in FactCheck’s case), that Labour have overstated the certainty of a rough estimate of a hypothetical situation.

The terminology of a “factcheck” leads some to reject the verdict because it does not correct the figure – as it would with a factcheck of a factual measure or observation. For example, as one tweeter noted in response to Reality Check: “This isn’t a ‘factcheck’ as you’ve not provided any alternative factual numbers. This is purely a BBC ‘opinion piece’. And … it doesn’t feel very impartial!”

On the other hand, Labour opponents, this time replying to Full Fact, argued that an unproven statement was false and demanded a stronger judgement.

This (perhaps selective) desire for clear verdicts reflects an impulse by some Twitter users to weaponise factchecking as a campaigning tool. My research on the last election found that the most widely shared factchecking tweets were those that included a verdict – especially verdicts critical of the Conservatives, such as Michael Fallon misstating the previous manifesto promise on army numbers (retweeted at the time by 909 people), and, to a lesser extent, those favourable to Labour, such as Jeremy Corbyn being judged right on police cuts (retweeted by 604 people at the time).

However, while we all bring our political beliefs, values and attachments to our reading of media messages to some extent, this is not incompatible with a serious consideration of the arguments and evidence. My research has found that some Twitter users raise sensible criticisms of the factchecks, and this holds true for the above example. For instance, several highlight the issue that only 9% of drugs are sourced directly from the US (mostly attributed, interestingly, to broadcaster Andrew Neil rather than Reality Check – some even in reply to Reality Check). Others draw their own conclusions based on a reasonable interpretation of the evidence.

How factcheckers can respond

There is little that factcheckers can do about public scepticism when it is driven solely by motivated reasoning or the use of Twitter as a platform to propagandise for a favoured party. They can, however, respond to some of the more reasonable concerns about impartiality in interpretive judgements.

The example above highlights a particular problem with factchecking predictions, because they are by their very nature uncertain and contestable – and not in themselves factual. But they are also completely central to the decisions that voters need to make in an election. Factchecking journalism that restricted itself narrowly to the facts that politicians draw on, without explaining the extent to which those facts actually support the argument they are making, would be of no practical use.

The central problem, then, is that “factchecking” is not a very good description of what factcheckers actually do. What they do, at their most effective, is assess the strength of arguments. FactCheck’s Twitter bio reflects this remit succinctly: “Testing the claims of people in power”. Reality Check, by contrast, claims to “cut through the spin and concentrate on the facts”, which gives a narrower perception of its legitimate remit.

Similarly, Reality Check is more cautious about going beyond statistical conclusions. It therefore leaves out the more complex political context that is needed to substantiate a judgement on political claims – whether they are serious attempts to indicate the outcome of proposed policies, or more speculative or rhetorical extrapolations. Greater clarity about the relationship between facts and arguments, and exactly how this supports the verdict, could increase voters’ confidence in the valuable work that factcheckers do.
