
Bad numbers make for killer headlines – and dodgy news

Bogus certainty in the reporting of numbers can have dire consequences.

Last week, The Guardian informed us the Eurozone Crisis will Cost World’s Poorest Countries US$238bn. Really? Not US$237 billion or US$239 billion? Perhaps it was just a wonky headline, and the article would be more nuanced. But no: the article soberly reported that the current crisis will cost US$238 billion – an impossible level of certainty for an economic forecast!

Does accuracy or inaccuracy of this sort really matter? Of course it does. Numbers often drive the news cycle, with consequences as far-reaching, or far-fetched, as the claims they seem to support.

Many examples come to mind. On October 18 2006, the venerable New York Times soberly declared that the 300 millionth American had arrived at 7:46am EST, according to the US Census Bureau.


There was no mention of the well-known fact that millions of Americans go uncounted because they are poor or undocumented (nor of the level of uncertainty in any such count). Even ignoring such uncertainties, the date and time could not possibly be pinned down to the minute.

Similarly, an October 31 2011 New York Times headline announced the arrival of the seven billionth human on the planet. At least in this case, the paper noted that, according to the best government censuses, there was a “window of uncertainty” of six months either way.

As noted above, such numbers often drive the news cycle. On January 27 2009, CNN alarmed its American readers with the headline: Bank Bailout Could Cost US$4 Trillion. An alarming figure for the struggling taxpayer by any measure – it seemed solid and immovable, with little room for revision, except maybe upwards.

But according to a little-publicised April 2012 US Treasury report, all but US$60 billion of the original US$700 billion black hole in the US financial sector has been repaid, and it is expected that eventually all will be repaid, returning a net profit to the taxpayer. Not as newsworthy, perhaps? It was certainly not likely to have people frothing at the mouth and/or reading the accompanying article.


In a similar way, unemployment reports are volatile and are often dramatically revised after initial release – regardless of the furore that often accompanies their reporting. US figures are more frequently revised than Canadian and Australian figures, which are released with less haste.

On June 1, the US Labor Department reported the US economy added just 69,000 jobs, far fewer than the 165,000 predicted, and the same report slashed the April figure to 77,000, down from 115,000 in an earlier report.

Needless to say, employment reports are used by policy makers, investors and others worldwide to make very important decisions: deciding elections, driving countries into sovereign default, possibly making fortunes for a few while trimming the pensions of many. Indeed, recent employment figures in the US could become a deciding factor in the upcoming presidential election.

Misleading, or at least less-than-fully explicit, numerical reports are also rampant in reporting of health and environmental issues. The real numerical facts behind mad cow disease, oil and natural gas reserves, earthquakes caused by fracking, and the health effects of the Fukushima accident are often far different from what even the moderately well-educated public understands.


Some of this is inevitable since ambiguous headlines, much like uncertain witnesses, do not make cases. But even assuming pristine intentions, the discourse is nearly always hyperbolic and inimical to good policy making.

But could we have been cherry-picking here — recalling our favourite historical bloopers, and ignoring the much more numerous instances of responsible reporting? To test this hypothesis we decided to look online only for stories published at the end of last week, on June 22 and June 23. Here’s what we found:

1) The LA Times reported that employment in Orange County’s hotels, restaurants, theme parks and other tourism-related businesses hit “181,500 jobs”, which exceeded the “181,400 jobs” figure from July 2008. Four significant digits?

(The number of significant figures in a result is just the number of figures that are known with some degree of reliability. The number 13.2 is said to have three significant figures. The number 13.20 is said to have four significant figures.

The rule of thumb is to report only the digits you can trust, which depends on the context. The National Institute of Standards and Technology lists important physical constants and indicates the uncertainty with which each has been measured.)
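To make this concrete, here is a minimal Python sketch – our own illustration, not taken from any of the reports discussed, and the helper name round_sig is ours – of what reporting the Orange County jobs figure to fewer significant figures would look like:

```python
import math

def round_sig(x, sig):
    """Round x to `sig` significant figures."""
    if x == 0:
        return 0
    exponent = math.floor(math.log10(abs(x)))
    factor = 10 ** (exponent - sig + 1)
    return round(x / factor) * factor

# The Orange County tourism-jobs figure from example 1:
print(round_sig(181_500, 4))  # -> 181500, the precision as reported
print(round_sig(181_500, 3))  # -> 182000
print(round_sig(181_500, 2))  # -> 180000
```

The reported “record” rests entirely on the fourth significant digit: a 100-job gap over the 2008 figure that is very likely smaller than the uncertainty in the underlying employment estimate.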

2) The BBC told us bird flu “could mutate to cause deadly human pandemic”. Yet in the body of the article we read that:

The virus is […] deadly to humans but can only be transmitted by close contact with infected birds […] It is for this reason that relatively few people have died of bird flu. Latest World Health Organization (WHO) figures indicate 332 people have died of the illness since 2003.

3) The Chicago Sun-Times reported that 144,000 Medicare recipients in Illinois have received 50% discounts on brand-name drugs, for an average savings of “$667 per person.” Not $666 per person? Was this number just plucked from the air?

4) The Dallas Morning News ran with the headline: “Middle-income family spends $235,000 to raise baby.” The article subsequently made it clear that this was a current average figure, from a government report.

But the article also pointed out, referencing a US Department of Agriculture report, that the cost of raising a child was just over US$25,000 in 1960 – and this figure “would be $191,720 today when adjusted for inflation.” Five significant figures! (We return to this figure in a short sketch after these examples.)

5) The Washington Post reported that the US Army supports “99 bands” and intends to spend US$221.1m on them in the coming year.

It’s hard to believe the US Army has a perfect listing of all musical groups, much less a four-significant-digit figure for the exact amount to be spent on them all in the coming budget year.


6) The Toronto Star told us “Canada ranked 51st in access to information list.” The article never describes how the rating was determined, being satisfied with noting that Canada has dropped by 12 places.

It does tell us that “Serbia, India and Slovenia top the report’s ranking list, while Liechtenstein, Greece and Austria come last among the 89 countries with an access regime.” Does this mean we should all buy tickets to Serbia? Sadly that’s never explained.
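Returning to example 4, as promised: a back-of-the-envelope Python sketch shows why an inflation-adjusted figure cannot carry five significant figures. The spread of 1960 values below is our own illustration of what “just over US$25,000” might cover; only the US$25,000 and US$191,720 figures come from the article.

```python
# Inflation multiplier implied by example 4: $191,720 / $25,000 (about 7.67).
multiplier = 191_720 / 25_000

# "Just over $25,000" plausibly spans a few hundred dollars either way;
# watch how the adjusted figure moves across that range.
for cost_1960 in (24_800, 25_000, 25_200):
    print(f"${cost_1960:,} in 1960  ->  ${cost_1960 * multiplier:,.0f} today")
```

A couple of hundred dollars of slack in the 1960 figure moves the adjusted number by well over a thousand dollars either way, so the trailing digits of “$191,720” convey no information at all.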

We have written previously on the innumeracy crisis and its impact on the public’s assessment of risk. Let us finish here with a few suggestions for journalists and bloggers:

  • Avoid bogus certainty. “May cost … over US$200 billion” is just as informative and a lot more honest than “will cost … US$238bn”.

  • Headlines should honestly reflect content. The Wall Street Journal’s headline, Global Warming Seen Lifting California Sea Level a Foot, distorts an otherwise good article which starts much more carefully:

    Global warming may push sea levels as much as a foot higher in California in the next two decades, threatening airports, freeways, ports and houses, according to a report examining risks along the US West Coast. Increases are forecast to be greatest south of Cape Mendocino, with levels rising 1.5 inches to 12 inches (4 to 30 centimeters) by 2030.

  • Talk about relative likelihood. Even with its cautious opening, the WSJ article made no attempt to quantify the probabilities involved or how much confidence even the researchers had in their analysis. This is now routine with opinion polls such as this Canadian poll, which ends:

“a randomly selected sample of 1,099 adult Canadians was interviewed online through the Ipsos online panel. The margin of error is 3.1 percentage points, 19 times out of 20.”

We wonder, though, how much the above means even to educated readers. We also note that changes in polling methodology typically have much larger impacts than 3.1 percentage points.
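For readers wondering where a number like “3.1 percentage points, 19 times out of 20” comes from: “19 times out of 20” simply means a 95% confidence level, and the textbook worst-case margin of error for a simple random sample of n people is 1.96 × √(p(1 - p)/n) with p = 0.5, which for n = 1,099 comes to about three percentage points – in line with the quoted 3.1. A minimal Python sketch (ours, not Ipsos’s methodology):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case margin of error for a simple random sample of size n.

    p = 0.5 maximises p * (1 - p); z = 1.96 is the normal critical value
    for 95% confidence, i.e. "19 times out of 20".
    """
    return z * math.sqrt(p * (1 - p) / n)

print(f"{margin_of_error(1_099):.1%}")  # -> 3.0% for a sample of 1,099
```

Note that this formula assumes a simple random sample of the whole population, an assumption an online panel can only approximate – one more reason the quoted margin means less than it appears to, and why changes in methodology can easily swamp it.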

Numbers are not holy water to sprinkle throughout news articles. That kind of profligate number inflation does nothing but confuse and complicate. On the other hand, careful use of well-explained numerical data can make or break an issue.

Next time you read any article with lots of numbers, you might ask yourself what they measure, how accurate they might be, and how they were determined.

But a word of caution: if you do that you may well have the unfortunate feeling that things just don’t add up.
A version of this article first appeared on Math Drudge.
