Science not plummeting in schools: report is ‘way out’

Watch your maths: an Australian Academy of Science report looks to be based around mistaken use or interpretation of numbers. Flickr/emdot.

The Federal Department of Education says it advised the Australian Academy of Science’s authors of a break in the series of student numbers when it supplied the data. The lead author, Professor Denis Goodrum, questions this. See the Department’s statement below the main story, and see also responses from Professor Goodrum in the comment stream.

The Australian Academy of Science’s alarming claim this week that final-year science at high school is collapsing is itself collapsing, as the Academy’s evidence appears to be plain wrong.

The gaffe has sparked concerns about a “cry wolf effect” that may undermine more credible efforts to build support for science education.

In a report released on Wednesday, The Status and Quality of Year 11 and 12 Science in Australian Schools, the Academy of Science warned that year 12 science studies are in a continuing slide, from near-universal enrolment 20 years ago to barely half of students last year. The report, commissioned by Australia’s Chief Scientist, Professor Ian Chubb, made headlines here and abroad, with its lead author, Professor Denis Goodrum, saying that “the overall drop in science study as a whole is quite staggering.”

However, the tables the report supplies as evidence appear to contain blatant errors or inconsistencies of calculation.

Professor Goodrum said that figures in the report puzzled the research team. “We’ve mulled through the obvious question why, but we don’t have an explanation. One can hypothesise on a whole range of things but you have to take the data - as we have - in good faith and work with it accordingly … you have to go back to the data that you’ve got.” Professor Goodrum said that he has asked the Department to clarify the situation, but had not yet received an answer.

However, science education analyst Dr Terry Lyons of the University of New England said that the report’s figures are “way out” and present an “exaggerated case of decline”. He said the problem was with how the data have been aggregated.

Speaking about the report’s claim that 94.1% of year 12 students took science in 1992, Dr Lyons said the real figure was significantly lower. “In the early nineties only around 65-70% of year 12 students studied a science subject. Examining the [report’s] tables, it is likely that the errors arose from the way the data were aggregated,” said Dr Lyons, who routinely analyses enrolment figures from the Department.

One fundamental error - in some years listed in report tables - was to simply add together the enrolments of each science subject, Dr Lyons said. “This approach ignores the fact that many students were enrolled in more than one science subject – consequently they were counted more than once. So instead of science students making up 92.82%, 80.87% and 76.33% of their cohorts in 1993, 1998 and 2001, for example, as claimed in Table 2.3, the actual percentages were 68%, 60% and 55% respectively. While this still represents an enrolment decline, it is certainly not anything like the decline claimed,” said Dr Lyons, Associate Director of Science Education at the National Centre of Science, Information and Communication Technology and Mathematics Education for Rural and Regional Australia.
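The double-counting problem Dr Lyons describes can be illustrated with a small sketch. The figures below are hypothetical, chosen only to show the mechanism, not drawn from the report or the Department’s data:

```python
# Hypothetical cohort: counting subject enrolments overstates the share of
# students doing science, because a student taking two science subjects
# is counted twice.

students = {
    "A": {"Physics", "Chemistry"},   # takes two science subjects
    "B": {"Biology"},
    "C": set(),                      # takes no science
    "D": {"Chemistry"},
}

cohort = len(students)                                             # 4 students
enrolments = sum(len(subjects) for subjects in students.values())  # 4 enrolments
science_students = sum(1 for subjects in students.values() if subjects)  # 3 students

print(f"Naive rate (enrolments / cohort): {enrolments / cohort:.0%}")
print(f"Actual rate (students / cohort):  {science_students / cohort:.0%}")
```

Here the naive calculation reports 100% science participation, while only three of the four students (75%) actually study any science, because student A is counted twice. The same mechanism, at scale, would inflate a true participation rate of around 68% to the report’s claimed 92.82%.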

Compounding this error is an apparent, unexplained switch in how the annual numbers of students from 2002 onwards were calculated. “While Table 2.2 would suggest a total enrolment aggregate (not student number) of 142,923 in 2002, instead it is reported in Table 2.3 as 105,761,” Dr Lyons said. “It may be that these do not overrepresent students enrolled in multiple science subjects. Nevertheless, it would appear that the different calculation methods have resulted in an exaggerated rate of decline,” he said.

There were important lessons to learn from this gaffe, Dr Lyons said. “Everyone makes mistakes and while this one may be embarrassing for those involved, there is a positive side. It demonstrates the importance of scrutiny and peer review in the scientific process. If other researchers such as [NSW Chief Scientist] Professor Mary O’Kane had not queried the results, most people would have taken them at face value,” he said.

The public discussion sparked by the erroneous information showed that many people did take the report and its alarming headlines at face value, Dr Lyons said. The resulting “cry wolf” effect could undermine public faith in science’s well-founded claims, and “may also undermine attempts to get the attention of policy makers,” he said.

Furthermore, it would be a shame if fundamentally flawed reports such as the Academy’s distracted from efforts to build up science in schools: “There is undoubtedly a problem in the level of science participation and engagement in schools, and it would be unfortunate if confusion over the scale of this problem deflected attention away from developing strategies to address it,” Dr Lyons said.

The Federal Department of Education, Employment and Workplace Relations (DEEWR) has supplied the following statement:

DEEWR confirms that the figures on numbers of students studying at least one tertiary accredited science subject in Year 12 supplied on page 11 of the Academy’s report are correct.

It should be noted, however, that there is a break in series between 2001 and 2002. Had the original series been continued, there would have been an extra 37,000 students included as studying at least one science subject in 2002 with similar numbers through to 2005. DEEWR made the report’s authors aware of this break in the series when it provided the data.

The data on the numbers of full-time Year 12 students are published in the Australian Bureau of Statistics publication Schools Australia, Catalogue 4221.0 each year. These data are available from the ABS website.