Back in 2012, MPs elected for the current parliamentary term were asked a simple question: what is the probability of flipping a coin twice and getting two heads in a row? The correct answer, as many of you will know, is 25% – each fair flip lands heads with a probability of one half, so two in a row is a half of a half. Sadly, only 40% of MPs got it right.
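For readers who want to check the arithmetic, the answer can be confirmed by simply enumerating the four equally likely outcomes of two flips – a minimal sketch, not part of the original survey:

```python
from itertools import product

# Enumerate all equally likely outcomes of two coin flips:
# ('H','H'), ('H','T'), ('T','H'), ('T','T')
outcomes = list(product("HT", repeat=2))

# Only one of the four outcomes is two heads in a row.
two_heads = [o for o in outcomes if o == ("H", "H")]
probability = len(two_heads) / len(outcomes)

print(probability)  # 0.25, i.e. 25%
```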
Not everybody needs to be a maths whiz to be good at their job, and not everybody has an interest in statistics and probabilities. But when it comes to the people elected to run the country, who make decisions on billion-pound budgets and hold governments to account, it is surely reasonable to expect they have a basic grasp of the numbers.
MPs come across complex data sets on a daily basis – whether on waiting lists in their constituency or investment in major national projects – so a core part of the role must be asking good questions and assessing evidence before decisions are made.
In the five years since the last general election, we have seen numerous politicians misusing statistics to one degree or another, with some of the worst offences rebuked by the UK Statistics Authority. For example, the shadow chancellor, Ed Balls, was taken to task for claiming a 400,000 rise in people on zero hours contracts when the numbers were not for comparable periods and the changes may have been partly seasonal.
Sometimes this is down to bloody-minded politics, as in the case of the work and pensions secretary, Iain Duncan Smith. When the UKSA pointed out that there was no statistical evidence to support his claim that the benefits cap had led 8,000 claimants to find work, Duncan Smith responded: “I have a belief I am right”.
Other times, the questionable use of numbers can be down to misunderstandings about chance, extrapolation or causation, with MPs misinterpreting (wilfully or otherwise) what is in front of them. Of course, many politicians are all too happy to include helpful numbers in a speech or letter, but ignore the inconvenient ones. The problem is that this leads to a lack of credibility: the figures used by politicians simply cannot be trusted.
That is certainly the public perception, at least. Polling commissioned by King’s College London in conjunction with the Royal Statistical Society and Ipsos MORI showed that a full two-thirds of people disagreed with the statement that “politicians use figures accurately when talking about their policies”. More than half said they had no trust at all in the information provided by politicians. That is bad for statistics – and bad for democracy.
Getting parliament to count
Last week, the Royal Statistical Society launched its #ParliamentCounts campaign, encouraging voters to ask parliamentary candidates in their area to attend statistical training if they are elected. After the election the RSS will be contacting MPs to book in a session covering basic statistics and data handling, including pointers on how to avoid some of the common pitfalls.
The response to the campaign has been promising – dozens of MPs and parliamentary hopefuls have responded very positively. It’s a good start, though there are still thousands of candidates across the country who haven’t yet been asked.
Of course, it isn’t just elected officials who would benefit from statistical training. Senior bureaucrats and public servants should be encouraged to undertake it too, given the decisions they make on behalf of others. Investing in statistical training would help ensure they avoid making poor choices that affect citizens and waste public money.
This campaign won’t turn Westminster into a centre of statistical excellence overnight, but it would be a significant step in the right direction. It would also help set a precedent for future elections, in which a good grasp of the numbers is seen as a key skill by voters. Most of all, it would help MPs make better decisions for the people they represent.