Are Australian universities getting better at research or at gaming the system?

Time to make research audits more transparent? www.shutterstock.com

University research in Australia is improving, according to the latest round of results from the Excellence in Research for Australia (ERA) audit.

Every two to three years the ERA reviews hundreds of thousands of research papers from researchers in universities across Australia.

For each university, research in each field (such as psychology, chemistry, medicine and history) is rated from one to five stars, the latter being what all universities strive for – “well above world-standard” research. These results determine how much research funding universities receive. It’s a big deal for institutions.

On the surface, it may look like the ERA exercise has achieved what it set out to do – improve Australia’s collective research performance.

However, research that I have been leading over the past five years – examining performance measurement in publicly funded services, including the ERA – suggests that we should be wary about how these results are being produced.

Growth, gaming or fraud?

Through this work, I have become acquainted with the various pressures, professional responses and governance practices operating at every level of universities – from individual academics, to teams, to units, to executives.

My research shows that strategic gaming and what could appear to be fraud is systematically happening in universities as part of ERA processes. Universities construct submissions by allocating publications to fields of research (FoR) to demonstrate high research quality and quantity.

Consider the following cases:

In one university, one senior science executive explained that they performed so strongly in one field that they reclassified “surplus” publications to another field with the hope of increasing the second field’s ranking. For example, research in civil engineering might be reclassified as chemical engineering.

In another university, almost half of the research papers submitted for a professional discipline were neither authored by members of that profession nor published in journals associated with it. The strategy was to artificially inflate the apparent size of research activity in a field to enhance the university’s ERA rank. This is not uncommon. I’m aware that the practice has also been used by universities in the social sciences.

The ERA assesses research fields, not institutional departments. In another case, a university submitted research on the basis of a department in which it was undertaken, not the field of research it contributed to.

These strategic gaming practices are, however, not without risk.

The ERA rules limit the shifting of journal articles by linking FoR codes to journal titles, but they still leave considerable space for institutional discretion, particularly for books and research funding. Submitted data must also be scrutinised by ERA assessors and research evaluation committees.

In some cases, such gaming strategies have been detected by ERA processes. The ARC reportedly sent “please explains” to several universities.

I am also aware that some ERA evaluators did not reward the reallocation of research publications into a different discipline to inflate that discipline’s apparent size. Sometimes, however, the strategy pays off, with one institution receiving a five-star rating in a “gamed” research field.

To pretend strategic gaming does not happen – or that it will be discovered and punished, or is of no consequence – is sheer nonsense.

The creators of the ERA must think critically about what the exercise is actually doing to Australia’s universities, and whether the many millions of dollars spent running it are worth the cost.

The real question for the education minister, his department and the Australian Research Council (ARC) is: what will they do about it?

How to move forward: make the process open to the public

One approach is to reduce the capacity for gaming within ERA processes.

A way to do this could be for the ARC to make universities’ ERA submissions publicly available.

At present, ERA submissions are confidential. They are typically created within institutions by executives and administrators who have no accountability to the very researchers whose performance data they manage, massage and submit.

Such transparency would provide external checks by academics who have a personal interest in their own discipline, rather than in whichever discipline administrators deem their research strategically useful for.

It would also enable public scrutiny of institutions that cannot justify their ERA submissions.

Apply stricter rules for submitting research

Another option is to impose much stricter rules on the allocation of ERA inputs, such as only allowing publications to be submitted under journals’ FoR codes or authors’ self-identified FoR codes.

Similarly, ERA rules could ensure that research funding can only be submitted into the fields of the investigators who won it.

Academics and their unions should also be allowed to challenge the internal secrecy that typically operates within universities in the preparation of ERA submissions. Given the rise of the corporate managerial university, such an approach seems unlikely to gather much momentum.

How valuable is the ERA exercise?

A third approach is to question the value of the ERA exercise and to find new ways in which to enhance collective research quality and assessment.

The ERA process costs many millions of dollars. It also consumes thousands of hours of academics’ time in preparing and reviewing submissions.

With all the data currently out there, how useful is the ERA process?

An alternative approach would be for the Australian government to require universities to systematically, publicly and regularly report their research inputs and outputs in a standardised format.

The ARC could commission research to analyse this publicly available data at a fraction of the cost of the ERA and under the quality control of academic peer review. This approach is much more suited to a 21st-century open government.