NAPLAN doesn’t stand up to international tests

How do NAPLAN tests compare?

A new parliamentary report on the National Assessment Program – Literacy and Numeracy (NAPLAN) finally takes a long, hard look at the calibre of these controversial tests.

As part of the committee process, the Australian Curriculum, Assessment and Reporting Authority (ACARA), the body responsible for NAPLAN, recommended a number of reforms including online delivery, linking NAPLAN to the National Curriculum, reducing the time gap between testing and results, and introducing flexible delivery of the tests.

But missing from this list is perhaps the most important change: improving the test itself. NAPLAN needs to reflect a higher standard, especially when we compare it internationally.

How do we compare?

One simple way to see how NAPLAN stacks up globally is to compare results from the NAPLAN assessment to another international literacy test. Comparisons between NAPLAN and another reading test for Year 4 students (known as PIRLS) show that NAPLAN may not have the depth – or the benchmarks – to assess Australian schoolchildren in a global context.

For starters, the number of students reaching the minimum standard differs. Results published by the Australian Council for Educational Research (ACER) showed that more than one-quarter of the Australian Year 4 students who participated in PIRLS failed to meet the minimum international benchmark.

In contrast, the most recent round of NAPLAN results showed only 9% of Year 5 students did not meet the minimum national standard, as the figure below shows. (The grey indicates students who are failing to meet the benchmark standard in literacy.)

How is the same cohort faring? NAPLAN data from 2012; PIRLS data from 2011. Author/ACER/ACARA

On the international PIRLS test, Australia’s average score was similar to the scores for Bulgaria, New Zealand, Slovenia, Austria, Lithuania and Poland. It was significantly lower than the average score for 21 other countries, including the United States, England, Canada, Hong Kong and Singapore.

Comparing PIRLS results across countries. ACER (see p.16 for a complete list of international results)

The same cohort of children took the two tests. So what is going on here?

There are three possible explanations: the NAPLAN standard is too low, PIRLS texts are more difficult, or PIRLS items are more challenging. All three, it seems, are true.

NAPLAN misses the international mark

Dr Sue Thomson from the Australian Council for Educational Research (ACER), the body that oversees PIRLS, said recently that “almost everybody agrees that the NAPLAN standards are too low”. Even in sections where the two assessments are somewhat similar – for example, making inferences about fiction texts – students who are meeting the NAPLAN minimum are falling short of the international benchmark.

A spokesperson for ACARA said that it will “take account of international standards” when aligning the national assessment program to the national curriculum. It hopes the curriculum – mid-way through its roll-out – will also lead to “improved results”.

But teachers believe the longer PIRLS texts turn students off. One Year 4 teacher explained: “they’re not used to reading such long periods of text”. While the PIRLS reader has only two stories, each is nearly 800 words long; NAPLAN has seven stories, all less than 200 words.

There are two types of passages: informational and fiction. Readability statistics show PIRLS informational texts are harder to read.

Interestingly, PIRLS fiction texts are slightly easier than the NAPLAN literacy texts. This brings a sobering point: even with easier texts, fewer Australian students meet the international benchmark once the word count goes beyond five paragraphs.

Although the NAPLAN tests are about ten items longer, the additional questions are easier recall questions. PIRLS asks more higher-order questions about its fiction texts and includes more open-ended items.

Data from publicly available PIRLS (2011) and NAPLAN (2012) reading tests

Getting the results to principals

At an education department National Principals’ Conference in early March, only ten out of 120 principals in the audience knew about the PIRLS results. This information needs to reach principals, teachers, parents and students for meaningful progress to occur.

Currently, international reading assessments have longer texts, harder non-fiction, deeper questions and higher benchmarks. When the Senate committee returns to the inquiry after the election, let’s hope that the content, nature and standards of NAPLAN play a more central role in its recommendations.

If Australian students are to be held to a high international standard, NAPLAN needs to improve to become a world-class test.
