Declining NAPLAN participation rates are likely skewing the data

It is usually students in the lower bands who are absent from NAPLAN testing, so results may be inflated. Stefan Postles/AAP

The National Assessment Program – Literacy and Numeracy (NAPLAN) is supposed to assess all Australian students in Years 3, 5, 7 and 9 to see how schools and students are performing against a national average.

The data is used to inform education policy and strategies to improve student learning in literacy and numeracy. Despite repeated claims to the contrary, NAPLAN has become a high-stakes test for schools, with school, sector and system data released for public scrutiny. As a result, students, teachers, schools and systems are held to account, both positively and negatively, for the results that are achieved.

Since its inception, student participation in NAPLAN tests has declined steadily every year. There are a number of ways that students can be non-participants in one or more of the tests.

Exempt, absent or withdrawn

There are three options for student non-participation in NAPLAN.

A student may, through school and parent/carer consultation, be exempt from sitting the tests. This form of exclusion applies to students who have recently (within 12 months) arrived in Australia from a non-English-speaking country, or who have a disability (though the definition of “disability” for this purpose is not clear).

Absenteeism is self-explanatory: the decision is made by the parent/carer or, perhaps, the student themselves.

Withdrawing a student from NAPLAN requires parents/carers to formally remove their child from the test. Two prevailing reasons for this are philosophical or religious grounds. However, parents/carers are able to withdraw their child without having to provide an explanation.

Data on student non-participation has been available since 2009. Between 2009 and 2014, participation fell by 1.6% across each of the year levels sitting the tests. Each year has seen a decline in participation, with 2014 recording the most non-participating students in every year level.

Participation in NAPLAN testing has dropped every year since it started. Dean Lewins AAP

In 2014, the number of non-participating students was more than 4,000 higher in each year level than in 2009. Along with this, the number of students exempted from the tests has risen steadily each year.

This downward trend in participation is expected to continue as the 2015 NAPLAN data is released. Why is this happening?

To sit or not to sit?

This is a question many parents grapple with. The issue of test anxiety is a big one. Many schools report that NAPLAN has a negative impact on their students, especially those at primary school.

Asking young students to perform in high-stakes testing can have a long-term detrimental effect on their psychological wellbeing. If a student, while still developing an attributional capacity for success and failure, scores poorly in a high-stakes test, their prospects for later educational progress may be hampered.

A 2013 study of US elementary students doing high-stakes testing showed clear evidence that participating in national tests at a young age was stressful. The students involved showed statistically higher indicators for anxiety.

Withdrawal of students on philosophical grounds is also on the rise. Parents/carers are choosing to withdraw their children for a wide range of reasons, although data on the specific reasons is not available from the national curriculum authority (ACARA).

In April, ACARA’s general manager of assessment and reporting, Stanley Rabinowitz, released an online message aimed at parents/carers to encourage participation. He said that NAPLAN:

… gives us a national snapshot of how children are performing.

However, with increasing numbers of parents/carers opting their children out of NAPLAN, how accurate is the data?

The effect of non-participation on the data

The data provided by ACARA about student participation rates does not include students who are exempt from sitting the tests. This means that only students who are withdrawn or absent on the dates of testing are counted as having not participated.

While students who are exempt are counted as not achieving the National Minimum Standard, they do not provide data that contributes to the mean scores for the nation in the tested areas.

It is estimated that more than one-third of students identified as having additional educational needs do not participate in NAPLAN. This means the data we have on non-participation does not include all those students who do not sit the tests. Despite this, the non-participation rates still impact the data.

To counteract these declining participation rates, scores are imputed for students who are withdrawn or absent. These are estimates drawn from students of similar backgrounds who do sit the tests. However, students who are absent or withdrawn from NAPLAN tend to be lower-performing students.

This means that the scores imputed into the data are likely to be higher than they would have been had the non-participating students sat the tests. The impact of this is inflated data that is not representative of Australian students.
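The inflation mechanism can be sketched with a toy calculation. All of the numbers below are invented for illustration (they are not actual NAPLAN scores), and the imputation rule is simplified to "assign each absentee the mean of the students who sat the test":

```python
# Illustrative only: invented scores, simplified imputation rule.

# Scores for students who actually sat the test.
sat_scores = [420, 450, 480, 510, 540]

# Suppose two lower-performing students were absent or withdrawn;
# these are the scores they would have achieved had they sat the test.
true_absent_scores = [380, 390]

# Simplified imputation: each absentee receives the mean score of
# the students who did sit the test.
imputed_score = sum(sat_scores) / len(sat_scores)  # 480.0

total_students = len(sat_scores) + len(true_absent_scores)

# National mean as reported, with imputed scores for the absentees.
mean_with_imputation = (sum(sat_scores) + imputed_score * 2) / total_students

# National mean had every student actually sat the test.
mean_if_all_sat = (sum(sat_scores) + sum(true_absent_scores)) / total_students

print(round(mean_with_imputation, 1))  # 480.0
print(round(mean_if_all_sat, 1))       # 452.9
```

Because the absentees were below-average performers, imputing from the pool of test-sitters overstates the cohort mean — here by roughly 27 points — which is the distortion described above.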

Along with this, the increasing numbers of students not participating in the tests cause issues for the well-publicised comparisons between schools, as well as for the comparisons that are made between states and territories.

For many reasons, parents/carers and students are choosing not to participate in NAPLAN. This trend has been seen in both the US and UK in relation to high-stakes testing. It is likely to continue in Australia.

Given the concern about the impact of this on the data, perhaps it is time for ACARA to look at other, more effective ways of "measuring" the capabilities of all our students.