In Australian universities right now, research is everything. Administrators obsess over rankings in the new ERA system, which measures research performance. For academics, publishing in the top journals isn’t just part of playing the game: it’s the whole game.
But there are things that the rankings don’t tell you, and valuable work that research-obsessed university administrators currently don’t recognise.
It’s another example of the measurement system changing the ways people behave, for the worse. And the unintended consequences of this unhealthy research obsession are holding us back.
It’s the research, stupid
The accepted way to measure academic performance has become research output. Excellence in Research for Australia (ERA), the national research evaluation exercise run by the Australian Research Council (ARC), shapes the distribution of research funding and the way Australia presents its knowledge to the world.
Getting a slice of the $510 million Sustainable Research Excellence program has become the holy grail for many university administrators. But the chase ignores the hard work of teaching the next generation.
There is already evidence that research assessment exercises overseas have amplified the swing towards research in promotion policies. Take this report from Alan Jenkins and Graham Gibbs in the Guardian on 15 August 1995:
“A survey conducted by the Oxford Centre for Staff Development in the UK showed that while 96% of institutions included excellence in teaching amongst promotion criteria, only 11% of promotion decisions were made on teaching grounds. Another 38% of institutions reported never having promoted someone primarily on the grounds of teaching excellence. Written responses included ‘Not in living memory’, and the wonderfully disdainful ‘Not at this institution’. One respondent reported: ‘There are three criteria for appointment here: research, research and research’.”
In practice this may mean further concentrating research in a small number of institutions and perhaps the emergence of “teaching-only” departments or even universities.
Given the federal government’s commitment to significantly increase the participation rate of school leavers in higher education, many of these students may end up in research-poor environments.
We risk creating a strange mixture of elitism and egalitarianism.
Universities will be able to use ERA performance to guide the allocation of resources as well as invest in future skills.
How important is your journal?
Many academics spend their careers getting that paper into that journal. Name-dropping matters. To be taken seriously, and to enjoy the funding benefits, you have to be published in the key journals of your field. Risk-taking is avoided: to get published, you have to cite those who have trodden the path before you.
But the 2010 journal rankings seriously devalue various interdisciplinary research fields and could damage Australia’s strong international reputation in these fields.
Interdisciplinary research is often where the breakthroughs come.
So where does that leave those whose work is innovative? Multidisciplinary researchers are thinking outside their academic box, and they are being penalised for it.
In areas like the humanities, arts and social sciences, it’s tricky to assess the quality of research in the way that ERA requires. Its way of examining citation lists is not well attuned to measuring interdisciplinary research and cross-sector collaboration.
And that’s not all. International journals are favoured over local ones but they are not necessarily interested in publishing Australia-specific research. Thus, under the ERA, we applaud someone for publishing in an international journal, rather than recognise efforts to contribute knowledge and participate in debates in Australia.
A case in point is the potential effect on not-for-profit sector research, where journals with lower impact measures, smaller circulation and much higher acceptance rates now outrank the most longstanding and prestigious international journals in the field.
Writing off the not-for-profit sector
The limitations of ERA in measuring the quality of multidisciplinary research are glaringly obvious in not-for-profit studies.
It’s a rapidly growing area of research and is at the centre of many contemporary public policy debates.
According to 2006-07 Australian Bureau of Statistics (ABS) data, the Australian not-for-profit sector:
• Comprised 41,000 economically significant organisations.
• Employed 890,000 people, or 8.6% of employed Australians.
• Earned an income of $76 billion.
• Contributed $34 billion, or 3.4%, to GDP.
• Made an economic contribution equivalent to that of the government administration and defence industry, and one and a half times that of the agriculture industry.
• Counted over 13 million Australians (86% of adults) as members of at least one not-for-profit association, with 48% belonging to at least three.
• Relied on just under 1 million Australians holding office in its organisations.
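The percentages and absolute figures above imply national totals that can be sanity-checked with quick arithmetic. A minimal back-of-the-envelope sketch in Python (the implied totals are my own calculations from the quoted figures, not ABS numbers):

```python
# Back-of-the-envelope checks on the 2006-07 ABS figures quoted above.
# The implied totals below are inferred from the quoted percentages,
# not official ABS statistics.

sector_employees = 890_000      # people employed in the sector
share_of_employment = 0.086     # 8.6% of employed Australians
implied_workforce = sector_employees / share_of_employment
print(f"Implied total employed Australians: {implied_workforce / 1e6:.1f} million")

sector_gdp_contribution = 34e9  # $34 billion contribution to GDP
share_of_gdp = 0.034            # 3.4% of GDP
implied_gdp = sector_gdp_contribution / share_of_gdp
print(f"Implied national GDP: ${implied_gdp / 1e12:.2f} trillion")
```

Both implied totals (roughly 10 million employed Australians and a GDP around one trillion dollars) are plausible for 2006-07, so the quoted percentages and dollar figures are internally consistent.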
But if no one in Australia is researching the not-for-profit sector or volunteering, it is highly likely that there will be no undergraduate, or even postgraduate, courses on these subjects. That potentially underserves 800,000 paid employees and over 6 million volunteers – Australia’s largest workforce.
The peer review process: an imperfect science
The panels reviewing journals could have been more transparent about how they operated and what was done with their recommendations.
In the case of not-for-profit research, all submissions were made through the Australian Business Deans Council (ABDC) joint submission. The ABDC ranked the not-for-profit journals appropriately on its own list, but they were all then downgraded in the final ERA list.
It is difficult to know what happened. Maybe the ABDC’s advice was rejected, or maybe it didn’t defend not-for-profit research strongly enough. I hope the current round of consultation will give greater access to groups that better represent those engaged in multidisciplinary research.
Why fields of research codes don’t tell the whole story
If your work falls neatly into a Field of Research (FoR) code, you’re in luck: ERA can easily identify what you’re doing. But if you collaborate outside the strict parameters of a single FoR and your output spreads across several four-digit research fields, a strong ERA score is much harder to achieve. This strengthens disciplinary “silos” while multidisciplinary approaches become invisible.
This was the case for not-for-profit studies. It has no distinct FoR code, and internationally esteemed multidisciplinary not-for-profit journals were outranked by less prestigious, subject-specific and far less cited marketing, management and accounting journals.
A way forward would be to allocate not-for-profit research its own FoR code, much as the ARC has done for other multidisciplinary areas such as tourism studies (a field with its own category, even though the tourism sector employs only half as many workers as the not-for-profit sector).
Other countries have Fields of Research or equivalent that separately identify not-for-profit and voluntary sector studies, so there is a strong case for international compatibility.
What will happen if we don’t take action?
Teaching and research are interdependent. Research productivity significantly adds to both the quality and substance of classroom teaching and teaching adds to the quality of research, not least because it allows for the (often valuable) input of students.
The ERA ratings say nothing about teaching excellence. For this reason it is likely that the ERA will further intensify the research culture in many university departments, probably at the expense of teaching.
The division created within departments between researchers and teachers can leave them unable to function as communities of scholars. Instead they become a setting for game-playing by some, and a home for resentment and bitterness for others.
By devaluing the top journals in multidisciplinary fields of research, we are on a path that leads us away from accepted international best practice – just when we need more than ever to ensure that our researchers have international standing.
It is time for not-for-profit sector researchers to call on the ARC to revise the 2010 Journal Ranking list to recognise this important field of research.