Are we headed for an educational disaster? Hardly

Two new international reports on school performance should be put into perspective.

The recent release of Australia’s performance in the TIMSS (2011) and PIRLS (2011) test results has sparked much media comment about what this means for the quality of Australian education.

The focus so far has been on the rankings in maths, science and reading and where Australian students fit on the ladder, but this is only part of the story.

Before we make dire predictions about what these rankings mean and what needs changing, some perspective is needed. We need to take a closer look at what these measures can really tell us about our school system – what’s useful data and what’s not.

Apples and oranges

First of all, rankings like these can be problematic because they compare apples with oranges. There is not much point comparing Australia with countries such as Singapore, Finland, Japan and Korea, for example.

All these countries value different things in their education systems, and all are also quite mono-cultural in makeup. In some countries, like Korea, these test scores influence educational policy. This has meant that these education systems have essentially grown to support students achieving high test scores.

As a consequence, Korea has had to legislate for a maximum number of hours students can spend in “cram school” – extracurricular schools that focus on tests.

This is a path Australian education has not taken, and it shouldn’t. To follow these countries and allow test results to drive education policy would be a mistake.

We must remember that these tests, which focus on content and associated skills common across different countries, are quite narrow in their focus.

What they can tell us

But these tests can, of course, provide us with some useful information. In the most recent report, rankings have received all the attention, but the report also contains valuable research on the factors that can influence student achievement, such as family background.

Rankings, too, can tell us how our students are faring in these areas, and what we may or may not have emphasised since the last series of tests.

One thing that has remained consistent across all of these international tests is the low percentage of Australian students achieving at the highest level. These students are important: they not only begin to achieve at their potential, but they are also an influential group in bringing the rest of their cohort along with them.

This trend of under-performing high achievers has been evident in the Australian data for some time.

Reading, writing and arithmetic

This is the first time Australia has participated in the PIRLS reading test, so it’s not very helpful to look at Australia’s ranking. It is simply the first snapshot of this type that we have.

But this is not the case in mathematics and science, where we have participated for some time, and where some trends persist. The profile of student achievement in mathematics has shifted slightly, and this may reflect different emphases on different aspects of the curriculum over time.

In science, achievement in the physical sciences remains lower than in biology, with earth science a little more variable.

But once again, the context matters here. For many Australian students, the focus on separate disciplines in science does not begin until Year 3, so results for Year 4 students in this study need to be regarded cautiously. Achievement does appear to improve by Year 8.

Bigger picture

The other focus of the report was a study of other factors in student achievement. This valuable data looks at issues of gender, resources at home, resources at school, the background qualifications of teachers (which in Australia are quite high, given that teachers must either have a four-year undergraduate degree or a postgraduate qualification on top of an undergraduate degree), teachers’ years of experience, teachers’ job satisfaction, and so on.

But nowhere is there a report on the quality of teachers, as none of these measures can directly account for quality.

Interestingly, for the Year 4 teachers in the PIRLS testing, those with fewer than five years of experience had students who achieved marginally better than those of more experienced teachers. This is not the case for science, and a different scenario exists again for mathematics.

Gender differences still exist throughout this data, as do differences in resources at home and at school.

One of the take-home messages from these reports is for Australia to unpack what they indicate about our own students. What lessons can we learn about the highly individual students from very diverse backgrounds who make up our schools?

Such information is far more important than comparing ourselves with countries that have different priorities for the students in their educational systems.