PISA education rankings are a problem that can’t be solved

Question 1: Where is the leaning tower of Pisa? Niall Carson/PA Archive

With a heavy feeling of déjà vu, here we are again with another round of introspection on the OECD’s Programme for International Student Assessment (PISA) rankings and the mediocre education children must therefore be receiving in the UK. Except this time, there’s a twist: maybe we’re not so awful after all.

In the latest news from PISA HQ, the United Kingdom (or to be more precise, England, as the sample was only from English schools) has scored relatively well on the new problem-solving test. A sample of nearly 1,500 15-year-olds in 137 schools scored on average 517 points, placing us well above the OECD average and 11th out of 28 participating nations. Or, bearing in mind common criticisms of the PISA tests, sixth if you only include whole countries of more than ten million.

According to PISA, UK students “perform significantly better in problem solving, on average, than students in other countries who show similar performance in mathematics, reading and science” and “this is particularly true among strong performers in mathematics”.

English 15-year-olds scored better than average. PISA 2012 Problem Solving, OECD

Looking at the results of other countries, the usual “Britain’s useless schools” headlines might not be the only narrative that gets challenged.

All the Chinese jurisdictions – some of which, including Shanghai and Hong Kong, are ranked separately in the PISA tables – do worse than expected on “interactive tasks”. Most striking is Shanghai, which, although still a strong performer on the problem-solving tests, fares significantly worse than its scores in the PISA maths, reading and science tests would suggest.

Sweden, once Michael Gove’s “country of educational choice”, confirmed its slide down the rankings. But perhaps the major standout is Poland, which, after its premier league performance in the conventional 2012 tests, was being talked of as the next big thing. Now Poland is relegated back to the Championship with a score of only 481.

Change behind us

But before we gloat too much, let’s bring ourselves down to earth with three important issues.

First, in terms of what this might mean for the UK, particularly with regard to the new national curriculum, the short answer is not much.

Too much is set in stone now for these results to have any major impact: the final version of the national curriculum has been published. Although problem-solving is worryingly absent from most of the document (and, bizarrely, completely missing from science), it is a significant feature of the new maths programme of study. One of the three aims is that all pupils should “solve problems by applying their mathematics to a variety of routine and non-routine problems with increasing sophistication”.

The government has already decided on its post-PISA strategy and little is going to change with the publication of the problem solving strand.

Second, it’s crucial to remind ourselves that the PISA methodology is deeply flawed, and that reading too much into the results is frankly laughable. As the mathematician Dr Hugh Morrison has said: “there are very few things you can summarise with a number and yet PISA claims to be able to capture a country’s entire education system in just three of them. It can’t be possible. It is madness.”

Yong Zhao, a respected professor of education at the University of Oregon, has written a series of scathing articles on the “illusionary” and “misleading” PISA tests, accusing them of “glorifying educational authoritarianism” and “romanticising misery” through their holding up of Shanghai-China in particular as the star the rest of the world must follow. I recommend them to anyone who wants to know why placing too much store in anything PISA has to say is a highly dangerous game.

We can’t have it both ways. If our weak results have “scant validity”, so do our good ones.

Finally, it’s worth asking whether these results actually do what they say on the tin. PISA’s briefing on England asserts:

Across OECD countries, there has been a marked increase in recent decades in the share of jobs that require creative problem-solving skills … PISA’s first assessment of creative problem-solving skills shows how well-prepared students are to confront – and solve – the kinds of problems that are encountered almost daily in 21st century life.

Looking at the sample questions, this is a highly challengeable claim. The questions are better tests of pupils’ ability to read and follow instructions than they are of the genuine creativity that many employers now say they want.

If the practice tests are anything to go by, PISA is going to be great at encouraging people to get better at buying railway tickets and setting thermostats, but perhaps not so good at fostering radical and original solutions to unpredictable, non-mechanical problems.

In terms of problem solving, the biggest problem of all is once again highlighted by these results: PISA itself. It purports to be an accurate indicator of entire countries’ education systems. On this point, Svend Kreiner, professor of statistics at the University of Copenhagen, has said: “the best we can say about PISA rankings is that they are useless.”

This won’t stop politicians from using them to justify their own policies, but at the very least this should encourage us all to challenge them when they do.
