Where is the evidence for ERA? Time’s up for Australia’s research evaluation system<figure><img src="https://images.theconversation.com/files/415177/original/file-20210809-27-1vlmjgb.jpg?ixlib=rb-1.1.0&rect=0%2C181%2C4500%2C2997&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/australia-high-resolution-excellence-concept-189296396">Shutterstock</a></span></figcaption></figure><p>Research at Australian universities has been scrutinised through the Australian Research Council’s (ARC) assessment exercise, <a href="https://www.arc.gov.au/excellence-research-australia">Excellence in Research for Australia</a>, since 2010. </p>
<p>A companion <a href="https://www.arc.gov.au/engagement-and-impact-assessment">Engagement and Impact Assessment</a> exercise began in 2018. The <a href="https://blogs.lse.ac.uk/impactofsocialsciences/2018/03/13/the-hidden-costs-of-research-assessment-exercises-the-curious-case-of-australia/">time and costs for universities</a> of running these exercises (the ARC collected this information when ERA began but never released it) and the value they generate for universities, government, industry and the public are unknown. </p>
<p>It’s difficult to see how any future versions can be justified without evidence of a healthy return on investment. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/starting-next-year-universities-have-to-prove-their-research-has-real-world-impact-87252">Starting next year, universities have to prove their research has real-world impact</a>
</strong>
</em>
</p>
<hr>
<p>The question of future assessment exercises is now in the spotlight. The ARC recently completed a <a href="https://www.arc.gov.au/excellence-research-australia/era-ei-review">review of ERA and EIA</a> to “ensure the national research assessments address Australia’s future needs”. </p>
<p>The review’s <a href="https://online.flippingbook.com/view/940831/5/">terms of reference</a> included consideration of “the purpose and value of research evaluation, including how it can further contribute to the Government’s science, research and innovation agendas”. This is important, as no evidence has ever been provided of exactly how the government, industry or community uses assessments for informing agendas. </p>
<p>The review received <a href="https://www.arc.gov.au/excellence-research-australia/era-ei-review#table">112 submissions</a> in response to a <a href="https://online.flippingbook.com/view/940831/">consultation paper</a>. Most came from universities, peak bodies/associations and various service providers and consultants. No responses were received from the sectors that supposedly benefit from these exercises, namely government, industry and the community.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/who-cares-about-university-research-the-answer-depends-on-its-impacts-149817">Who cares about university research? The answer depends on its impacts</a>
</strong>
</em>
</p>
<hr>
<h2>What are the issues with the system?</h2>
<p>A review advisory committee was then appointed to consider key issues and make recommendations to the ARC CEO. The committee readily identified key concerns about how the assessments work, such as rating scales, streamlining and automation, evaluation cycles and eligibility requirements. These matters also came up in university submissions. </p>
<p>But what came through most clearly from universities were the mixed views about the value of assessments as a whole. By extension, there is a question mark over whether they should continue if their utility cannot be clearly demonstrated. </p>
<p>While EIA has been run only once, there have now been four rounds of ERA overseen by four different ministers. Each round has culminated in a detailed national report with a minister’s foreword that consistently focuses on the same two matters: </p>
<ul>
<li>ERA results provide assurance of the government’s investment in the research sector</li>
<li>the results will inform and guide future strategies and investments. </li>
</ul>
<p>In other words, there has been an overarching focus on <em>justification</em> for the exercise and on its purported <em>utility</em>. But how convincing is this? </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/explainer-how-and-why-is-research-assessed-36895">Explainer: how and why is research assessed?</a>
</strong>
</em>
</p>
<hr>
<h2>ERA is past its use-by date</h2>
<p>In its early days, ERA was credited with playing an important role in focusing university efforts on lifting research performance. Indeed, a number of university submissions to the review acknowledged this. </p>
<p>However, much has changed since then. As university responses noted, new databases and digital tools – together with greater expertise in data analytics within universities to analyse performance – as well as the impact of international benchmarking through university and subject rankings have meant ERA’s influence has dramatically dwindled. Universities no longer need an <a href="https://melbourne-cshe.unimelb.edu.au/lh-martin-institute/fellow-voices/australian-research-council-review">outdated assessment exercise</a> to tell them how they are performing.</p>
<p>As for its actual application, there was a brief time when ERA informed funding allocations under the <a href="https://www.dese.gov.au/download/3009/2016-sre-process-calculations/4218/document/pdf">Sustainable Research Excellence for Universities scheme</a>. It was one of a number of schemes through which government support for university research was based on their performance. But this was quickly abandoned. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/415754/original/file-20210811-21-1gkg71c.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="screenshot from archived ERA web page on ARC website" src="https://images.theconversation.com/files/415754/original/file-20210811-21-1gkg71c.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/415754/original/file-20210811-21-1gkg71c.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=100&fit=crop&dpr=1 600w, https://images.theconversation.com/files/415754/original/file-20210811-21-1gkg71c.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=100&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/415754/original/file-20210811-21-1gkg71c.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=100&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/415754/original/file-20210811-21-1gkg71c.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=126&fit=crop&dpr=1 754w, https://images.theconversation.com/files/415754/original/file-20210811-21-1gkg71c.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=126&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/415754/original/file-20210811-21-1gkg71c.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=126&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">ERA data were once used to inform government funding allocations, but funding is no longer mentioned on the website.</span>
<span class="attribution"><a class="source" href="https://web.archive.org/web/20190227231533/https://www.arc.gov.au/excellence-research-australia">Wayback Machine archives</a></span>
</figcaption>
</figure>
<p>In 2015, with a clear focus on incentivising performance and simplifying funding, the government introduced revised <a href="https://www.dese.gov.au/review-research-policy-and-funding-arrangements">research block grants</a>. In the process, it overlooked the very exercise that identifies research excellence and so ought to inform performance-based funding. </p>
<p>Since then, the best the government has been able to come up with is adding national benchmarking standards for research to the <a href="https://www.teqsa.gov.au/overview-changes">Higher Education Standards Framework</a>. But with the bar set so low and no apparent reward for institutions that perform well above the required standards, barely an eyelid has been batted over this change.</p>
<h2>‘Informing’ without evidence of use</h2>
<p>Returning to the review committee, its <a href="https://www.arc.gov.au/excellence-research-australia/era-ei-review">final report</a> of June 2021 acknowledged the vision for and objectives of ERA required rethinking, as these had lost their relevance or failed. This included the objectives of providing a stocktake of Australian research and identifying emerging research areas and opportunities for development. </p>
<p>But the committee has danced around the issue of ERA’s utility. It issued a lofty <a href="https://www.arc.gov.au/file/12022/download?token=M1zSgd5Y">vision statement</a>:</p>
<blockquote>
<p>“that rigorous and transparent research assessment informs and promotes Australian universities’ pursuit of research that is excellent, engaged with community, industry and government, and delivers social, economic, environmental and cultural impact.” </p>
</blockquote>
<p>The ARC has adopted it as part of the <a href="https://online.flippingbook.com/view/52483419/">ERA and EI Action Plan</a>.</p>
<p>The notion of “informing” as a buzzword for influence and utility has been the consistent feature of ERA. It seems this will continue. The review committee’s report contains over 50 references to this idea. And “informing decisions” is to be one of the four objectives taken up by the ARC, specifically to “provide a rich and robust source of information on university excellence and activity to inform and support the needs of university, industry, government and community stakeholders”. </p>
<p>But no evidence has ever been provided of ERA’s usefulness to these sectors. This objective rings hollow, particularly in light of the conspicuous absence of industry or government responses to the review. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/unis-want-research-shared-widely-so-why-dont-they-properly-back-academics-to-do-it-151375">Unis want research shared widely. So why don't they properly back academics to do it?</a>
</strong>
</em>
</p>
<hr>
<figure class="align-center ">
<img alt="Entomologist looks at netting with lights to attract insects in the dark" src="https://images.theconversation.com/files/415176/original/file-20210809-17-1e7ls2a.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/415176/original/file-20210809-17-1e7ls2a.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/415176/original/file-20210809-17-1e7ls2a.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/415176/original/file-20210809-17-1e7ls2a.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/415176/original/file-20210809-17-1e7ls2a.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/415176/original/file-20210809-17-1e7ls2a.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/415176/original/file-20210809-17-1e7ls2a.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The ERA process has produced no clear evidence of how university research is being used.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/entomologists-insect-scientists-collecting-moths-beetles-1326638879">Shutterstock</a></span>
</figcaption>
</figure>
<h2>The vanishing link to funding</h2>
<p>Of course, the really big question is whether ERA and EI will ever inform research funding. That’s something the ARC has raised over the years, and it is possibly the only reason universities remain so compliant. </p>
<p>Curiously, though, the review’s terms of reference did not cover this issue. Perhaps, after 11 years, no one can work this out. Now that would surely represent a very poor return on investment.</p>
<p class="fine-print"><em><span>Ksenia Sawczak does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>Ksenia Sawczak, Head, Research and Development, Faculty of Arts and Social Sciences, University of Sydney. Licensed as Creative Commons – attribution, no derivatives.</em></p>
In 2019, women’s rights are still not explicitly recognized in US Constitution<p>Over nine decades, efforts to amend the U.S. Constitution to recognize women’s rights have faced major challenges. </p>
<p>Congress finally passed such legislation, known as the Equal Rights Amendment, in 1972. The <a href="https://www.gpo.gov/fdsys/pkg/STATUTE-86/pdf/STATUTE-86-Pg1523.pdf">amendment</a> would explicitly recognize women’s rights under the law as equal to men’s. </p>
<p>Despite concerted campaigns by women’s rights groups, it fell short of ratification by the 38 states needed to make it part of the Constitution. The original deadline for states to ratify was 1979. Congress extended the deadline to 1982, but even then the amendment still fell three states short of <a href="https://www.history.com/this-day-in-history/equal-rights-amendment-passed-by-congress">passage</a>. </p>
<p>Nevertheless, women’s rights activists have continued working to get states to <a href="https://now.org/resource/chronology-of-the-equal-rights-amendment-1923-1996/">ratify</a> it. </p>
<p>Many ERA proponents argue that the deadline is irrelevant because the 27th Amendment to the Constitution, which prohibits changes to the salaries of congressional legislators, was <a href="https://fas.org/sgp/crs/misc/R42979.pdf">ratified in 1992</a>, 203 years after it was introduced. The same could happen to the ERA, they argue. They maintain that <a href="https://www.nytimes.com/2018/05/31/us/equal-rights-amendment-illinois.html">Congress</a> has the power to change the deadline and recognize the 38 ratification votes to approve the amendment.</p>
<p>Some <a href="https://www.chicagotribune.com/news/local/politics/ct-met-equal-rights-amendment-illinois-20180530-story.html">constitutional experts</a>, however, argue that it may be too late, since the deadline passed more than three decades ago. They also suggest that, while its passage would have symbolic importance, the ERA might only make a difference at the margins where the law still allows sex discrimination. </p>
<p>I’m a scholar who studies <a href="https://www.cambridge.org/us/academic/subjects/sociology/political-sociology/abortion-politics-mass-media-and-social-movements-america?format=HB&isbn=9781107069237">gender</a> and <a href="https://journals.sagepub.com/doi/abs/10.1177/2329496515603726?casa_token=4HXIJlECyQQAAAAA%3AY9b0XOGxgif8EinuzkdxBcW53F80hF0khTztRdnu3Kx6DxC5I0_Nou7RiY8K3KsLxdIk6QgaxWyb">politics</a>. Here’s a quick summary of how the country got to this point and the barriers that still exist to adding the Equal Rights Amendment to the Constitution.</p>
<h2>‘Ladies against women’</h2>
<p>Women’s rights advocates argue that sex discrimination is a pervasive problem that could be resolved by the ERA. Even though the Equal Protection Clause in the <a href="https://www.loc.gov/rr/program/bib/ourdocs/14thamendment.html">14th Amendment</a> prohibits states from denying any person equal protection under the law, women’s rights are not explicitly guaranteed.</p>
<p>The push for equal rights heated up in the 1920s after women gained the right to vote. <a href="https://catalog.hathitrust.org/Record/012280242">Alice Paul</a>, a suffragette, proposed the first version of an Equal Rights Amendment in 1923. The proposal was adopted and turned into proposed legislation by two Kansas Republicans, Sen. Charles Curtis and Rep. Daniel Anthony Jr., and was brought up during every congressional session between 1923 and 1971 without success.</p>
<p>The idea of an Equal Rights Amendment, however, gained momentum among politicians and the broader public. <a href="https://bepl.ent.sirsi.net/client/en_US/default/search/detailnonmodal/ent:$002f$002fSD_ILS$002f0$002fSD_ILS:316728/ada">World War II</a> opened many doors for women, who filled gaps in the labor force while men were off fighting. During this time, women were welcomed into politics, onto juries, openly wooed by educational institutions and encouraged to take up male-dominated majors such as math, science and technology.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/250374/original/file-20181213-110253-1yhjz22.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/250374/original/file-20181213-110253-1yhjz22.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/250374/original/file-20181213-110253-1yhjz22.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=390&fit=crop&dpr=1 600w, https://images.theconversation.com/files/250374/original/file-20181213-110253-1yhjz22.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=390&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/250374/original/file-20181213-110253-1yhjz22.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=390&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/250374/original/file-20181213-110253-1yhjz22.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=490&fit=crop&dpr=1 754w, https://images.theconversation.com/files/250374/original/file-20181213-110253-1yhjz22.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=490&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/250374/original/file-20181213-110253-1yhjz22.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=490&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Wilma Scott Heide, center, president of the National Organization for Women, aka NOW, appears with two other women at a press conference in Washington, D.C., Feb. 17, 1973.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Watchf-AP-A-DC-USA-APHS459194-Equal-Rights-Amen-/e662fac2a9a248c4816a647546f63ffa/1/0">AP Photo/Jim Palmer</a></span>
</figcaption>
</figure>
<p>By 1970, the Equal Rights Amendment had been endorsed by four sitting presidents – Republicans Dwight D. Eisenhower and Richard Nixon, and Democrats John F. Kennedy and Lyndon Johnson. The fledgling feminist group, the <a href="http://www.cornellpress.cornell.edu/book/?GCOI=80140100290360">National Organization for Women</a>, adopted the passage of the ERA in its 1967 Bill of Rights for Women and began staging massive demonstrations and lobbying politicians in the late 1960s and early 1970s in an effort to get Congress to pass the amendment. </p>
<p>Finally, <a href="https://fas.org/sgp/crs/misc/R42979.pdf">in 1972</a>, the ERA passed both houses of Congress. The Amendment would have seven years to be ratified by three-fourths, or 38, of the 50 states.</p>
<p>While 30 states ratified the ERA in 1972 and 1973, the amendment ultimately came up three states short of approval by the 1979 deadline. </p>
<p>This was in large part due to the efforts of conservative women’s organizations such as <a href="https://www.press.uchicago.edu/ucp/books/book/chicago/W/bo5977742.html">Eagle Forum</a> and <a href="https://www.cambridge.org/us/academic/subjects/sociology/political-sociology/abortion-politics-mass-media-and-social-movements-america?format=HB&isbn=9781107069237">Concerned Women for America</a> that opposed it. Conservative women <a href="https://academic.oup.com/socpro/article-abstract/32/4/348/1734706">regarded the ERA</a> as a <a href="https://theconversation.com/could-the-era-pass-in-the-metoo-era-87901">threat</a> to the family and child-rearing because it would disrupt traditional gender roles. They also believed women would lose, among other things, their exemptions from the draft and combat duty.</p>
<p>States such as Illinois and Florida became battlegrounds for liberal and conservative women fighting over the amendment. Feminists successfully lobbied Congress to extend the ERA’s ratification deadline to June 30, 1982. The ERA, however, was not ratified by the three states needed to ensure its passage. In 1982, conservative women proclaimed the Equal Rights Amendment officially dead. </p>
<h2>Another chance?</h2>
<p>A number of recent events have put the ERA back on the political agenda: high-profile allegations of sexual assault, the #MeToo movement and, among other issues, increasing restrictions on women’s access to abortion. </p>
<p>Since 2017, two more states – <a href="https://www.npr.org/sections/thetwo-way/2018/05/31/615832255/one-more-to-go-illinois-ratifies-equal-rights-amendment">Nevada and Illinois</a> – have ratified the Equal Rights Amendment. Supporters are now <a href="https://www.washingtonpost.com/local/the-equal-rights-amendment-has-languished-for-decades-virginia-must-put-it-over-the-top/2018/11/29/c454c8f4-f3f0-11e8-80d0-f7e1948d55f4_story.html?utm_term=.11b907510481">rallying support in Virginia</a>, hoping it will be the next and final state to ratify it in 2019.</p>
<p>At the same time, for a number of reasons, <a href="https://history.nebraska.gov/blog/nebraskas-again-again-relationship-equal-rights-amendment">Nebraska</a>, Tennessee, Idaho, South Dakota and Kentucky rescinded their ERA ratifications between 1972 and 1982. Some state legislators argued that the amendment was <a href="https://www.press.uchicago.edu/ucp/books/book/chicago/W/bo5977742.html">too controversial</a> given its potential to upend traditional gender roles and legalize what they called “abortion on demand.”</p>
<p>So, even if Virginia legislators ratify the amendment, the fate of the Equal Rights Amendment is unclear.</p>
<p>The Supreme Court could weigh in on whether these reversals should affect the amendment’s addition to the Constitution. But it is not clear that it would. In 1939, for instance, the Supreme Court opted not to rule on a rescinded ratification of the <a href="https://supreme.justia.com/cases/federal/us/307/474/">Child Labor Amendment</a>, whose ratification period had expired. </p>
<p>Likewise, it’s unclear how Congress will respond since the amendment expired decades ago. Congress certainly has the power to ignore the five rescinded ratifications – it has done so <a href="https://www.washingtonpost.com/news/monkey-cage/wp/2018/06/20/the-equal-rights-amendment-is-one-state-from-ratification-now-what/?utm_term=.b5ceadc9c3c7">in the past</a>. But, in a highly polarized political environment, that may prove difficult.</p>
<p class="fine-print"><em><span>Deana Rohlinger does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>Deana Rohlinger, Professor of Sociology, Florida State University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
When measuring research, we must remember that ‘engagement’ and ‘impact’ are not the same thing<figure><img src="https://images.theconversation.com/files/117931/original/image-20160408-23914-15ysns8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">What is the purpose of measuring engagement, impact or quality?</span> <span class="attribution"><span class="source">from www.shutterstock.com</span></span></figcaption></figure><p>In the <a href="https://theconversation.com/turnbull-seeks-ideas-boom-with-innovation-agenda-experts-react-51892">Innovation Statement</a> late last year, the federal government indicated a strong belief that <a href="https://theconversation.com/ten-rules-for-successful-research-collaboration-53826">more collaboration</a> should occur between industry and university researchers. </p>
<p>At the same time, <a href="https://docs.education.gov.au/system/files/doc/other/20151203_main_report1.pdf">government</a>, <a href="https://www.go8.edu.au/programs-and-fellowships/excellence-innovation-australia-eia-trial">education</a> and industry groupings have made numerous recommendations for the “impact” of university research to be assessed alongside or in addition to the existing assessment of the quality of research. </p>
<h2>How should we measure research?</h2>
<p>But what should we measure and, more importantly, why should we measure it?</p>
<p>In accounting, we stress that the measurement basis of something inevitably reflects the purpose for which that measure is to be used. </p>
<p>So what is the purpose of measuring engagement, <a href="https://theconversation.com/there-is-no-easy-way-to-measure-the-impact-of-university-research-on-society-50856">impact</a> or, for that matter, quality? </p>
<p>The primary reason for measuring quality seems fairly self-evident – as a major stakeholder in terms of funding (especially dedicated research-only funding), the government wants an assessment of just “how good” by academic standards such research really is. </p>
<p>Looking ahead, measures of quality such as the Excellence in Research for Australia (ERA) rankings have been speculated to potentially influence future funding via prestigious competitive schemes (such as the Australian Research Council), block funding for infrastructure and the availability of government support for doctoral students via Australian Postgraduate Awards. </p>
<p>So the demand for a measure of research quality and the potential uses of such a measure are pretty clear.</p>
<p>But what valid reasons are there for investing significant resources in the measurement of research impact or engagement? </p>
<p>If high-quality research addresses important practical problems (large or small), surely we would expect impact to follow? </p>
<p>In this sense, the extent of impact is really a joint product of the quality (or robustness) of research and the choice of topic (ie, practical versus more esoteric).</p>
<h2>Research impact needs time</h2>
<p>But over what period should impact be measured? </p>
<p><a href="https://www.go8.edu.au/programs-and-fellowships/excellence-innovation-australia-eia-trial">Recent exercises</a> such as that conducted by the Australian Technology Network and Group of Eight have a relatively short-term focus, as would any “impact assessment” tied to the corresponding period covered by the existing ERA time frame (say the last six years). </p>
<p>I and many others maintain that impact can only be assessed over much longer periods, and that in many cases short-term impact is potentially misleading. </p>
<p>How often have supposedly impactful results subsequently been rejected or overturned? </p>
<p>Such examples inevitably turn out to reflect low quality (and in some cases outright fraudulent) research.</p>
<h2>Ranking impact</h2>
<p>Finally, how can impact be ranked? Is there a viable measure that can distinguish between high and low impact? Existing case-study approaches are unlikely to yield any form of quantifiable measurement of research impact.</p>
<p>Equally puzzling is the call to measure research engagement. What is the purpose of such an exercise? Surely in a financially constrained research environment, universities readily recognise the importance of such engagement and pursue it constantly. </p>
<p>We don’t need a national assessment of engagement to encourage universities to engage. </p>
<p>Motive aside, one approach canvassed is the quantum of non-government investment in research (ie, non-government research income). </p>
<p>This is arguably one rather limited way to measure engagement, and is focused on input rather than output. If the purpose of any measurement is to capture outcomes, does it make sense to focus exclusively on inputs? The logic of this escapes me.</p>
<h2>Engagement and impact are not the same thing</h2>
<p>Even more worryingly, some use the terms engagement and impact interchangeably. </p>
<p>They would have us believe that a simple (but useful) measure of impact is the extent to which university researchers receive industry funding. Surely this is, at best, a measure of engagement, not impact.</p>
<p>Although the two are likely correlated, the extent will vary greatly across discipline areas. </p>
<p>Further, in business disciplines, much of the “knowledge transfer” that occurs via education (including areas such as executive programs) reflects the impact of the constant process of researching better business practices across areas such as accounting, finance, economics, marketing and so on.</p>
<p>Discretionary expenditure on such programs by business is surely an indication of the extent to which business schools and industry are engaged, yet this would be ignored if we focused on research income alone.</p>
<p>We must not lose sight of the fact that quality (ie, rigour and innovativeness) is a necessary but not sufficient condition for broader research impact.</p>
<p>Engagement is not impact, and simple measures such as non-government research income tell us very little about genuine external engagement between universities and industry.</p>
<p>As accountants know, performance measurement reflects its purpose. What we need before any further national assessment of attributes such as impact or engagement is clear understanding of the purpose of such an exercise. </p>
<p>Only when the purpose is clearly specified can we have a sensible debate about measurement principles.</p>
<p class="fine-print"><em><span>Stephen Taylor is affiliated with the Australian Business Dean's Council as the 2016 ABDC Research Scholar.</span></em></p>
<p class="fine-print"><em>Stephen Taylor, Professor of Accounting, University of Technology Sydney. Licensed as Creative Commons – attribution, no derivatives.</em></p>
Will the impact framework fix the problems the research audit found?<figure><img src="https://images.theconversation.com/files/105367/original/image-20151211-8291-8gyfh3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">How useful is ERA for measuring research quality?</span> <span class="attribution"><span class="source">from www.shutterstock.com</span></span></figcaption></figure><p>The results from the latest university research audit indicate that <a href="https://theconversation.com/are-australian-universities-getting-better-at-research-or-at-gaming-the-system-51895">research in Australia is improving</a>.</p>
<p>This suggests that the Excellence in Research for Australia (ERA) exercise is working: ERA has achieved its main aim of boosting the quality of Australian research.</p>
<p>However, this headline statement masks a plethora of concerns.</p>
<p>Under the <a href="https://theconversation.com/turnbull-seeks-ideas-boom-with-innovation-agenda-experts-react-51892">government’s latest reform</a> of research funding, academics will be assessed not only on the quality of their research through the ERA, but also on the economic, social and environmental impacts of their research through <a href="https://theconversation.com/watt-report-suggests-financial-incentives-for-measuring-research-impact-51815">a new impact framework</a>.</p>
<p>The impact and engagement measures herald a new era that rewards researchers for collaborating beyond their institutions.</p>
<p>It is timely, then, to reassess ERA’s utility. Is it fit for purpose? Will these two assessment systems complement or contradict one another?</p>
<h2>What has gone well in ERA?</h2>
<p>The ERA processes have recognised peer review alongside metrics. </p>
<p>Research efforts at universities are arguably now more focused towards areas of strength. There is a clearer (though contested and arguably narrower) understanding of scholarly research, particularly that which is non-traditional. </p>
<p>On paper, ERA has established a system whereby research can be compared nationally and against international benchmarks.</p>
<h2>What isn’t working?</h2>
<p>Individual researchers are not assessed by ERA per se. However, they are assessed in line with ERA at the institutional level — in a system that awards a single score for an entire discipline cohort.</p>
<p>Inter-disciplinary research has been disadvantaged. ERA’s 1,238 field-of-research (FoR) codes make it problematic for researchers to publish outside their own discipline or academic unit. </p>
<p>Publishing, performing or exhibiting internationally is perceived to be more prestigious than in Australia. This unjustified exoticism diminishes the importance of Australian research and puts local and Australian publication outlets at risk. </p>
<p>A lack of transparency and accountability remains a critical problem. </p>
<p>The process by which final rankings are calculated remains opaque. It is unclear how the peer review of evaluation units is moderated and benchmarked globally. The rationale for inclusion, exclusion and change in the list of journals recognised by ERA has not been made public. </p>
<p>Whole disciplines ranked “below world average” are left to conduct their own research to fathom what went wrong. There is no feedback other than the score.</p>
<p>Esteem measures are narrow. The category “prestigious work of reference”, for example, is strikingly limited. It has never been opened to public discussion. Why have some publications been chosen and others omitted?</p>
<p>The ERA journal rankings were <a href="https://theconversation.com/why-the-era-had-to-change-and-what-we-should-do-next-1874">abolished in 2011</a>. However, their ghost influences decisions from journal selection to academic recruitment and promotion. </p>
<p>Universities still reward publication in high-ranking journals from the list; some institutions recognise only research published in A or A<sup>*</sup> journals, or those marked “quality” in the current list. </p>
<p><a href="http://link.springer.com/chapter/10.1007/978-94-007-7085-0_22">As predicted</a>, the editorial boards of these journals are struggling to cope with the influx of submissions. Lower-ranked journals and those with lower impact factors are struggling to survive. Many <a href="http://www.australianhumanitiesreview.org/archive/Issue-May-2009/genoni&haddow.htm">Australian journals</a> are disadvantaged by the bias towards international journals. </p>
<p>The audit culture most affects early career academics. They and others <a href="http://files.eric.ed.gov/fulltext/EJ926450.pdf">struggle</a> to negotiate the system, juggle heavy <a href="http://www-tandfonline-com.dbgw.lis.curtin.edu.au/doi/pdf/10.1080/07294360.2013.864616">teaching loads</a> and manage the <a href="https://minerva-access.unimelb.edu.au/bitstream/handle/11343/28917/264644_GoedegebuureAustraliasCasual.pdf?sequence=1">precarity</a> of casual academic employment. </p>
<p>The <a href="http://www.lhmartininstitute.edu.au/userfiles/files/research/attractiveness_ac_prof_res_brief.pdf">international mobility of Australian academics</a> is high and early career academics are the most likely to <a href="http://www.cshe.unimelb.edu.au/people/bexley_docs/The_Academic_Profession_in_Transition_Sept2011.pdf">move overseas or leave higher education</a>.</p>
<p>The loss of young academics from an ageing academic workforce risks Australia’s ability to meet future demand. Moreover, it impairs capacity for innovation.</p>
<h2>What are the concerns?</h2>
<p>Measuring engagement according to <a href="https://theconversation.com/australias-innovation-agenda-embracing-risk-or-gambling-with-public-health-52003">research income from industry</a> is concerning. </p>
<p>How, for example, will <a href="https://theconversation.com/in-universities-obsessed-with-research-heres-what-falls-between-the-cracks-938">collaborative research with not-for-profits</a> and innovative start-up companies be measured? How will the new measures account for these organisations’ exemptions from a cash contribution for Australian Research Council Linkage proposals? </p>
<p>There is a contradiction between a new impact measure that encourages a culture of risk-taking and ERA, which promotes risk-avoidance behaviours and impacts upon <a href="http://www.victoria.ac.nz/law/research/publications/vuwlr/prev-issues/volume-44,-issue-34/07-Butler.pdf">academic freedom</a> by directing research behaviour. This is particularly problematic for new researchers, blue-sky research and research with benefits that emerge only in the long term. </p>
<p>Both systems place professional service outside academic workloads. This raises new questions. Who will edit the journals, convene the conferences, become officers of professional associations, or write the handbooks and textbooks?</p>
<p>These activities are essential to the health of all disciplines. Increasingly, they are unrecognised and unrewarded. This has long-term ramifications for both research quality and impact. </p>
<p>Neither system recognises investments in partner communities that are critical to social licence to operate in many disciplines. </p>
<h2>Improving ERA</h2>
<p>Has ERA run its course? Perhaps. It certainly needs improvement. </p>
<p>The ERA process should be subject to external review. We need greater transparency about the criteria that inform assessment categories. We need discussion of categories not yet opened to consultation. </p>
<p>Given concerns <a href="https://theconversation.com/are-australian-universities-getting-better-at-research-or-at-gaming-the-system-51895">over gaming the system</a>, we need an audit of data that has been excluded from ERA submissions. There should be a review of disciplinary membership of the committees in terms of institutional representation through time.</p>
<p>We need ERA to cease peer reviews of outputs already subject to double-blind peer review.</p>
<p>There is a dire need to review the real cost of each ERA exercise, which runs approximately every three years. We need to consider whether the costs of assessing research excellence <a href="http://rev.oxfordjournals.org/content/20/3/247.short">exceed the benefits</a>. </p>
<p>While the ARC’s <a href="http://www.arc.gov.au/sites/default/files/filedepot/Public/ARC/Budget/PDF/2015_16%20ARC_Budget_Statements.pdf">administrative and departmental costs</a> are low, we also need to assess <a href="https://theconversation.com/the-era-assessed-cost-not-rated-and-league-tables-is-there-a-better-way-to-do-it-51865">the costs of university compliance</a> and of playing an effective strategic assessment game.</p>
<p>The new impact and engagement measures redress some of ERA’s deficiencies, but the challenges of cost, transparency, audit culture and external oversight remain. And teaching remains out in the cold.</p>
<p class="fine-print"><em><span>In 2007 Claire Smith was a member of the Humanities Assessment Panel for the Australian Government's short-lived Research Quality Framework. From 2009-2011 she was a member of the Humanities and Creative Arts Panel, College of Experts, Australian Research Council.</span></em></p><p class="fine-print"><em><span>Dawn Bennett is an ARC Peer Reviewer and an ARC Assessor.</span></em></p>The new impact framework will improve some of the problems arising out of the ERA’s university research audit, but major challenges will remain.Claire Smith, Professor of Archaeology, Flinders UniversityDawn Bennett, Research Professor, Curtin UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/518952015-12-07T19:10:19Z2015-12-07T19:10:19ZAre Australian universities getting better at research or at gaming the system?<figure><img src="https://images.theconversation.com/files/104610/original/image-20151207-22680-1agyu1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Time to make research audits more transparent?</span> <span class="attribution"><span class="source">www.shutterstock.com</span></span></figcaption></figure><p>University research in Australia is improving, according to the latest <a href="http://www.arc.gov.au/news-media/media-releases/research-excellence-innovative-future-era-2015-results-released">round of results</a> from the Excellence in Research for Australia (ERA) audit.</p>
<p>Every two to three years the ERA reviews hundreds of thousands of research papers from researchers in universities across Australia. </p>
<p>For each university, research in each field (such as psychology, chemistry, medicine and history) is rated from one to five stars, the latter being what all universities strive for – “well above world-standard” research. These results determine how much research funding universities receive. It’s a big deal for institutions.</p>
<p>On the surface, it may look like the ERA exercise <a href="http://www.theaustralian.com.au/higher-education/eras-rising-tide/story-e6frgcjx-1227633436699">has achieved</a> what it set out to do – improve Australia’s collective research performance. </p>
<p>However, <a href="http://socialscience.uq.edu.au/governing-performance">research that I have been leading</a> over the past five years – examining performance measurement in publicly funded services, including the ERA – suggests that we should be wary about how these results are being produced.</p>
<h2>Growth, gaming or fraud?</h2>
<p>I have become acquainted with the various pressures, professional responses and governance practices operating in universities from individual academics, to teams, to units, to executives.</p>
<p>My research shows that strategic gaming and what could appear to be fraud is systematically happening in universities as part of ERA processes. Universities construct submissions by allocating publications to fields of research (FoR) to demonstrate high research quality and quantity.</p>
<p>Consider the following cases:</p>
<p>In one university, one senior science executive explained that they performed so strongly in one field that they reclassified “surplus” publications to another field with the hope of increasing the second field’s ranking. For example, research in civil engineering might be reclassified as chemical engineering.</p>
<p>In another university, almost one half of research papers submitted for a professional discipline were not authored by members of that profession or in journals associated with that profession. The strategy was to artificially increase the size of research activity in a field to enhance their ERA rank. This is not uncommon. I’m aware that this practice has also been used by universities in the social sciences.</p>
<p>The ERA assesses research fields, not institutional departments. In another case, a university submitted research on the basis of a department in which it was undertaken, not the field of research it contributed to.</p>
<p>These strategic gaming practices are, however, not without risk. </p>
<p>The ERA rules limit the shifting of journal articles by linking FoR codes to journal titles, but they still leave considerable space for institutional discretion, particularly for books and research funding. Submitted data must also be scrutinised by ERA assessors and research evaluation committees.</p>
<p>In some cases, such gaming strategies have been detected by ERA processes. The ARC <a href="http://www.smh.com.au/national/education/universities-questioned-over-alleged-gaming-of-research-rankings-20151117-gl0yva.html">reportedly sent “please explains”</a> to several universities. </p>
<p>I am also aware that some ERA evaluators did not reward reallocating research publications into a different discipline to increase its apparent size. However, sometimes the strategy pays off, with one institution receiving a five in a “gamed” research field.</p>
<p>To pretend strategic gaming does not happen – or that it will be discovered and punished, or is of no consequence – is sheer nonsense. </p>
<p>The creators of the ERA must think critically about what it actually is doing in Australia’s universities, and whether the many millions of dollars to run it are worth the cost.</p>
<p>The real question for the education minister, his department and the Australian Research Council (ARC) is: what will they do about it?</p>
<h2>How to move forward: make the process open to the public</h2>
<p>One approach is to reduce the capacity for gaming within ERA processes. </p>
<p>A way to do this could be for the ARC to make universities’ ERA submissions publicly available. </p>
<p>At present, ERA submissions are confidential. They are typically created within institutions by executives and administrators with no accountability to the very researchers whose research performance data they manage, massage and submit. </p>
<p>Such transparency would provide external checks by academics who have a personal interest in their own discipline, and not the disciplines administrators deem their research to be strategically useful for. </p>
<p>It would also enable public shaming of institutions which cannot publicly justify ERA submissions.</p>
<h2>Apply stricter rules for submitting research</h2>
<p>Another option is to provide much stricter rules for allocating ERA input, such as only allowing publications to be submitted according to journals’ FoR codes or to authors’ self-identified FoR code.</p>
<p>Similarly, ERA rules could ensure that research funding can only be submitted into the fields of the investigators.</p>
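<p>Allocation rules like these are simple to check mechanically. The sketch below is a hypothetical illustration of the principle only, not the ARC’s actual submission system: the journal names and the journal-to-code mapping are invented for the example, though the four-digit codes loosely follow the ANZSRC FoR numbering.</p>

```python
# Hypothetical journal -> permitted FoR codes mapping (illustrative only;
# a real system would draw this from the ERA journal list).
JOURNAL_FOR_CODES = {
    "Journal of Civil Engineering": {"0905"},                 # civil engineering
    "Chemical Engineering Letters": {"0904"},                 # chemical engineering
    "Engineering Interdisciplinary Review": {"0904", "0905"}, # carries both codes
}


def allocation_allowed(journal: str, claimed_for_code: str) -> bool:
    """A publication may only be submitted under an FoR code its journal carries."""
    return claimed_for_code in JOURNAL_FOR_CODES.get(journal, set())


# Reclassifying a civil engineering paper as chemical engineering is rejected:
print(allocation_allowed("Journal of Civil Engineering", "0904"))  # False
print(allocation_allowed("Journal of Civil Engineering", "0905"))  # True
```

<p>Under such a rule, the “surplus publication” reshuffling described earlier would fail automatically at submission time, rather than relying on evaluators to spot it.</p>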
<p>Academics and their unions should also be allowed to challenge the internal secrecy that typically operates within universities in the preparation of ERA submissions. Given the rise of the <a href="https://theconversation.com/why-australia-needs-a-new-model-for-universities-43696">corporate managerial university</a>, such an approach seems unlikely to gather much momentum.</p>
<h2>How valuable is the ERA exercise?</h2>
<p>A third approach is to question the value of the ERA exercise and to find new ways in which to enhance collective research quality and assessment. </p>
<p>The ERA process involves many millions of dollars. It also involves thousands of hours from academics in preparing and reviewing ERA submissions.</p>
<p>With all the data currently out there, how useful is the ERA process?</p>
<p>An alternative approach would be for the Australian government to require universities to systematically, publicly and regularly report their research inputs and outputs in a standardised format. </p>
<p>The ARC could commission research to analyse this publicly available data at a fraction of the cost of the ERA and under the quality control of academic peer review. This approach is much more suited to a 21st-century open government.</p>
<p class="fine-print"><em><span>Paul Henman works for The University of Queensland, which is assessed under the ERA exercise. He has received research funding from the ARC which manages the ERA. </span></em></p>Making the whole process of auditing research open to the public could help reduce the capacity for universities to game the system.Paul Henman, Associate Professor, Sociology and Social Policy, The University of QueenslandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/510362015-11-29T19:15:31Z2015-11-29T19:15:31ZThe ‘lucky country’ needs to be re-invented<figure><img src="https://images.theconversation.com/files/103265/original/image-20151126-11998-1hqcl77.jpg?ixlib=rb-1.1.0&rect=17%2C0%2C4449%2C3259&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">We need researchers to collaborate with industry if we're to be an innovation nation.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Despite a lot of talk about the importance of science, technology, engineering and maths (<a href="https://theconversation.com/au/topics/stem">STEM</a>), technical innovation and improved interaction between the research universities and industry, little has improved in recent years.</p>
<p>In our vision for innovation we see changes that drive the kind of industry-academia interaction Australia needs: tax changes, a restructure of how universities are funded, broader training for post-graduate students to include industry engagement, and changes to some anachronistic institutions such as CSIRO. </p>
<p>It was into this situation that the Industry Mentoring Network in STEM (<a href="http://www.atse.org.au/content/industry-mentoring-network-in-stem-imnis.aspx">IMNIS</a>) program was founded last year, as an Australian Academy of Technology and Engineering (<a href="https://www.atse.org.au/">ATSE</a>) initiative. IMNIS was founded as a scalable, inexpensive intervention to enhance the commercial knowledge and focus of STEM post-graduates, thus enabling long-term cultural change.</p>
<p>IMNIS matches senior PhD students with successful industry mentors, and over the past six months has run pilot programs in biotechnology in Victoria and in minerals and energy in Western Australia. So far, student uptake and mentor involvement have been enthusiastic.</p>
<p>IMNIS is not the only program to educate graduates in commercial science, although its focus on networking and widespread volunteer mentoring does differ from training or placement schemes. </p>
<p>While IMNIS should help, and will certainly help individuals, all the schemes are collectively only a small part of the cultural change necessary for a technically innovative society. </p>
<h2>Chasing excellence</h2>
<p>We now have a university business model and research incentive scheme that does little to reward either university or industry for cross-fertilisation. Research university business models are based largely on selling education to international fee-paying students. </p>
<p>Meanwhile, the incentive for industry collaboration is low, whereas the drive to achieve a high rank in the Excellence in Research for Australia <a href="http://www.arc.gov.au/excellence-research-australia">ERA</a> scheme, which favours research publications over industry collaboration, is high.</p>
<p>So what is to be done? Many are suggesting solutions. ATSE has commented on the ERA structure, and everybody believes quality research should be rewarded. On the other hand, the ERA is now a bureaucracy in itself. </p>
<p>Every university group has a view on ERA and quality research outcomes, but what is “quality” in research anyway? Right now it is mostly a self-referencing system: quality research is publication in quality journals, and citation by others in said quality journals, a virtuous circle. </p>
<p>Quality research defined this way is supported by the Australian Research Council (<a href="http://www.arc.gov.au/">ARC</a>), National Health and Medical Research Council (<a href="https://www.nhmrc.gov.au/">NHMRC</a>), and obtaining funding from these sources – but not commercial sources – is also defined as a quality research measure, hence the circular virtue for those receiving grant funding. </p>
<h2>R&D tax incentive</h2>
<p>Over the past decade we have seen an erosion of industry directed support programs at federal and in some cases state level. The one incentive that has demonstrably delivered, even for small companies, is the <a href="http://www.business.gov.au/grants-and-assistance/innovation-rd/RD-TaxIncentive/Pages/default.aspx">R&D tax incentive</a>.</p>
<p>The previous scheme rewarded big business such as mining and banking and ignored the small entrepreneurial companies and new industries. The current R&D scheme rewards small companies and provides needed non-equity-diluting capital for R&D. </p>
<p><a href="http://www.ausbiotech.org/">AusBiotech</a>, the industry body for the Australian biotechnology and life sciences sector, has many examples of companies that brought clinical trials to Australia and have reached their value inflection point sooner and been able to invest more intensively in research as a result. This is ultimately good for the waiting consumer and good for the economy.</p>
<p>However, not all recipients fulfil the policy intent. Some large companies are “gaming the system” by reframing “business as usual” activities as research. So if we’re to maximise the benefit from the scheme for the country, we need to tighten eligibility criteria to ensure the purpose of the incentive is delivered. </p>
<p>What history has shown is that for all tax, education and grant schemes, we have become the crafty country, across the spectrum of society. For industry, the definition of research appears to have become very wide, and in education the scope of allowable taxpayer-financed study has reached ludicrous levels.</p>
<p>One suggestion is that the R&D incentive should be structured to encourage industry to collaborate and fund universities. But why should industry do this? What makes university research more beneficial than projects run by industry? Universities are all “R” and little “D”, even if they have tried to access more funding by claiming translational activities. </p>
<p>Instead of more special pleading, we need real commitment to collaboration.</p>
<h2>Post-graduate training</h2>
<p>With new data indicating that only 10% of current PhD students in STEM will gain a permanent academic position, we need urgent changes to broaden graduate opportunities. Australian PhD graduates are well regarded overseas for their scientific training yet they have little knowledge of industry. </p>
<p>Industry will hire these bright young scientists for their talent yet could end up firing them for their personal style if they cannot work in teams and communicate effectively.</p>
<p>Universities are now rushing to <a href="https://go8.edu.au/publication/discussion-paper-changing-phd">address these issues</a>. But the university sector is not rewarded for industry collaboration, nor does it really value it. </p>
<p>Universities largely subsidise the on-costs of ARC/NHMRC grants, yet they seek “commercial” rates for industry collaborations. And why not, as only the former has ERA and ranking value? Yet the message this sends to university staff and potential industry collaborators is that the university does not consider research done with industry to be priority research; it is of lesser value and quality, almost tradesman-like. </p>
<p>We cannot build an innovative culture while trapped in a mindset of special pleading, where purist mentalities prevail like old aristocrats disdaining trade. We will not change without major reform in our structures and thinking. For researchers and industry, it’s not about more funding to do more of exactly what they are doing now; it’s about major change to institutional structure and reward.</p>
<h2>Broad reforms</h2>
<p>Transforming to an “innovative culture” will require broad reforms in attitude and an acceptance of change. CSIRO is one organisation that could do with reform in this area. </p>
<p>CSIRO has been lumbering along for years trying many models of operating, and it’s clearly the most logical group for translational research, leaving the pure research for universities.</p>
<p>We advocate for a disaggregated CSIRO, composed of semi-independent, commercially driven institutes that persist only so long as they address the needs of Australian industries, with CSIRO’s public-good functions folded back into university-affiliated institutes.</p>
<p>Outside academia, we can re-examine the R&D incentive to push the creation of new innovative companies, rather than largely just supporting existing players. Targeted capital gains tax relief is another tool that can encourage risk taking research investment in new small to medium enterprises with less off-target leakage to major and multi-national corporations. </p>
<p>We also need reward systems for universities to collaborate with industry, reward systems for staff within those universities that truly value industry collaboration and other community engagement, and universities training students to engage with industry as a natural part of their activities.</p>
<p>It will require a cultural shift, to be sure, but we believe the addition of such measures would detract nothing from, and in fact provide resources to, the pure research function of universities.</p>
<hr>
<p><em>This article is drawn from a conversation with Dr Tony Radford (Director IMNIS) and Dr Anna Lavelle (CEO AusBiotech).</em></p>
<p><em>This article is part of our series <strong><a href="https://theconversation.com/au/topics/why-innovation-matters">Why innovation matters</a></strong>. Look out for more articles on the topic in the coming days.</em></p>
<p class="fine-print"><em><span>Paul Wood is a Director of IMNIS. </span></em></p>An emphasis on innovation is great, but we need genuine reforms to universities and tax incentives if we’re to promote collaboration between research and industry.Paul Wood AO, Adjunct Professor in Biotechnology, Monash UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/368952015-02-19T19:35:53Z2015-02-19T19:35:53ZExplainer: how and why is research assessed?<figure><img src="https://images.theconversation.com/files/70794/original/image-20150202-13057-1ex8dnu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Citations, bibliometrics, "publish or perish": why must we constantly assess research?</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Governments and taxpayers deserve to know that their money is being spent on something <a href="http://www.tandfonline.com/doi/full/10.1080/19338244.2015.982002#abstract">worthwhile to society</a>. Individuals and groups who are making the greatest contribution to science and to the community <a href="http://link.springer.com/article/10.1007/BF02019306">deserve to be recognised</a>. For these reasons, all research has to be assessed.</p>
<p>Judging the importance of research is often done by looking at the number of <a href="http://onlinelibrary.wiley.com/doi/10.1002/hfm.20165/abstract">citations</a> a piece of research receives after it has been published.</p>
<p>Let’s say Researcher A figures out something important (such as how to cure a disease). He or she then publishes this information in a scientific journal, which Researcher B reads. Researcher B then does their own experiments and writes up the results in a scientific journal, which refers to the original work of Researcher A. Researcher B has now <a href="http://link.springer.com/article/10.1007%2Fs11192-012-0685-x">cited</a> Researcher A.</p>
<p>Thousands of experiments are conducted around the world each year, but not all of the results are useful. In fact, a lot of scientific research that governments pay for is often ignored after it’s published. For example, of the 38 million scientific articles published between 1900 and 2005, <a href="http://jama.jamanetwork.com/article.aspx?articleid=202114">half were not cited at all</a>.</p>
<p>To ensure the research they are paying for is of use, governments need a way to decide which researchers and topics they should continue to support. Any system should be fair and, ideally, all researchers should be scored using the same measure. </p>
<p>This is why the field of <a href="http://onlinelibrary.wiley.com/doi/10.15252/embr.201439608/full">bibliometrics</a> has become so important in recent years. Bibliometric analysis helps governments to number and rank researchers, making them easier to compare.</p>
<p>Let’s say the disease that Researcher A studies is pretty common, such as cancer, which means that many people are looking at ways to cure it. In the mix now there would be Researchers C, D and E, all publishing their own work on cancer. Governments take notice if, for example, ten people cite the work of Researcher A and only two cite the work of Researcher C.</p>
<p>If everyone in the world who works in the same field as Researcher A gets their research cited on average (say) twice each time they publish, then the international citation benchmark for that topic (in bibliometrics) would be two. The work of Researcher A, with his or her citation rate of ten (five times higher than the world average), is now going to get noticed.</p>
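<p>The benchmark arithmetic above can be sketched in a few lines of code. This is only an illustration of the idea, using the made-up numbers from the example; real bibliometric normalisation is more involved, also accounting for publication year and document type.</p>

```python
# Field-normalised citation rate: a researcher's average citations per paper
# divided by the world-average citations per paper for that field.

def relative_citation_impact(citations_per_paper: float, field_benchmark: float) -> float:
    """How a researcher's citation rate compares with the field's world average."""
    if field_benchmark <= 0:
        raise ValueError("field benchmark must be positive")
    return citations_per_paper / field_benchmark


# Researcher A averages 10 citations per paper; the field's world average is 2.
print(relative_citation_impact(10, 2))  # 5.0 -- five times the world average
```

<p>A score above 1.0 means the work is cited more than the world average for its field; this is the sense in which Researcher A, at five times the benchmark, “gets noticed”.</p>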
<h2>Excellence for Research in Australia</h2>
<p>Bibliometric analysis and citation benchmarks form a key part of how research is assessed in Australia. The Excellence for Research in Australia (<a href="http://www.arc.gov.au/era/">ERA</a>) process evaluates the quality of research being undertaken at Australian universities against national and international benchmarks. It is administered by the Australian Research Council (<a href="http://www.arc.gov.au/about_arc/default.htm">ARC</a>) and helps the government decide what research is important and what should continue to receive support.</p>
<p>Although these are not the only components assessed in the ERA process, bibliometric data and citation analysis <a href="http://www.arc.gov.au/pdf/ERA15/ERA%202015%20Submission%20Guidelines.pdf">are still a big part</a> of the performance scores that universities and institutions receive.</p>
<p>Many other countries apply formal research assessment systems to universities and have done so for many years. The United Kingdom, for example, operated a process known as the <a href="http://www.rareview.ac.uk/reports/roberts.asp">Research Assessment Exercise</a> between 1986 and 2001. This was superseded by the <a href="http://www.ref.ac.uk/">Research Excellence Framework</a> in 2014.</p>
<p>A bibliometrics-based performance model has also been <a href="http://www.palgrave-journals.com/eps/journal/v8/n3/abs/eps200919a.html">employed in Norway</a> since 2002. This model was first used to influence budget allocations in 2006, based on scientific publications from the previous year.</p>
<p>Although many articles don’t end up getting cited, this doesn’t always mean the research itself didn’t matter. Take, for example, the polio vaccine developed by Albert Sabin last century, <a href="https://www.jstage.jst.go.jp/article/kurumemedj/52/3/52_3_111/_article">which saves over 300,000 lives</a> around the world each year.</p>
<p>Sabin and others <a href="http://jama.jamanetwork.com/article.aspx?articleid=329147">published the main findings</a> in 1960 in what has now become one of the most important scientific articles of all time. By the late 1980s, however, Sabin’s article <a href="http://jama.jamanetwork.com/article.aspx?articleid=363835">had not even been cited 100 times</a>.</p>
<p>On the other hand, we have Oliver Lowry, who in 1951 published an <a href="http://www.jbc.org/content/193/1/265.citation">article describing</a> a new method for measuring the amount of protein in solutions. This has become the most <a href="http://www.jbc.org/content/280/28/e25.short">highly cited article of all time</a> (over 300,000 citations and counting). Even Lowry was surprised by its “success”, <a href="http://www.annualreviews.org/doi/abs/10.1146/annurev.bi.59.070190.000245">pointing out</a> that he wasn’t really a genius and that this study was by no means his best work.</p>
<h2>The history of research assessment</h2>
<p>While some may regard the assessment of research as a modern phenomenon inspired by a new generation of faceless bean-counters, the concept has been around for centuries.</p>
<p><a href="http://en.wikipedia.org/wiki/Francis_Galton">Sir Francis Galton</a>, a celebrated geneticist and statistician, was probably the first well-known person to examine the performance of individual scientists, publishing a landmark book, <a href="http://galton.org/books/men-science/">English Men of Science</a>, in the 1870s.</p>
<p>Galton’s work evidently inspired others, with an American book, <a href="http://books.google.com.au/books/about/American_Men_of_Science.html?id=IZ9LAAAAMAAJ&redir_esc=y">American Men of Science</a>, appearing in the early 1900s.</p>
<p>Productivity rates for scientists and academics (precursors to today’s performance benchmarks and KPIs) have also existed in one form or another for many years. One of the first performance “benchmarks” appeared in a 1940s book, <a href="http://books.google.com.au/books/about/The_Academic_Man.html?id=CA1CGvPJGtwC&redir_esc=y">The Academic Man</a>, which described the output of American academics.</p>
<p>This book is probably most famous for coining the phrase “publish or perish” - the belief that an academic is doomed if they don’t get their research published. It’s a fate that bibliometric analysis and citation benchmarks now reinforce.</p>
<p class="fine-print"><em><span>Derek R. Smith does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>Derek R. Smith, Professor, University of Newcastle. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h2>No boom without bust: a cautionary note about mining and employment</h2>
<p class="fine-print"><em>Published 24 July 2012.</em></p>
<figure><img src="https://images.theconversation.com/files/13333/original/bsmwbg36-1343093296.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Australia's boom investment conditions will begin tailing off by 2014, according to a Deloitte Access Economics report - so what does this mean for current labour shortages?</span> </figcaption></figure><p>Much public discussion around the current mining boom focuses on the lack of qualified staff to fill an expanding employment market. </p>
<p>But yesterday’s report by Deloitte Access Economics, which warns that the “peak of the project pipeline is already in sight” and expects investment to tail off in one to two years, brings into focus a little-discussed topic - the actual nature of economic cycles in mining. </p>
<p>While it still rates Australia as a global standout, the report warns that “the strong bit of Australia’s two speed economy may not stay as strong beyond 2014.” </p>
<p>If this report is correct - and there are good reasons to believe it is - it brings into question whether labour shortages in regional and remote mining areas, particularly in the key states of Western Australia and Queensland, should remain the central focus of discussion of the current mining boom. </p>
<p>If nothing else the Deloitte findings do point to the need for a closer examination of the actual nature of mining booms, and the capacity of the current conditions to deliver uninterrupted, long-term prosperity. </p>
<p>This analysis has been sadly lacking from the policy discussions of the mining boom, and so too have the lessons of previous mining cycles. </p>
<p>History tells us that mining is subject to boom/bust cycles where production expands in relation to demand, but when demand slackens, there are sudden corrections which come in the form of contraction of investment and production, mine closures and job losses. </p>
<p>It is important to point out that there is no immediate concern that this will occur in the current context. History does however provide some lessons that are worth considering. </p>
<p>Firstly, history tells us that commodity markets can be manipulated by consumers, who can exert significant bargaining power. Because Australia produces mineral resources for export rather than domestic consumption, it is exposed to fluctuations in the world export trade, not only from pressure from consumers but also from the entry of other suppliers. </p>
<p>Secondly, decisions about investment and production are based on forecasts of future demand, and consumers have a vested interest in over-forecasting their demand. The difficulty for resources firms is that there are long lead times between investment and actual production, meaning today’s investment must accurately predict consumer demand years into the future (as well as the investment decisions of other producers). </p>
<p>It is also in the interest of purchasers of export commodities to ensure security of supply. This means not only negotiating with many producers but also, where possible, contracting with suppliers of alternative fuels for energy or steel production. </p>
<p>These two consumer strategies are closely linked. Thus, to the extent that there is an over-investment in production across the resources sector, based on inflated demand, consumers are able to exert pressure on producers to reduce price. </p>
<p>This can and has occurred where a deal is struck with one key producer that provides for additional tonnage for that producer and potentially higher revenue, but at a lower unit price. </p>
<p>The danger is that this sets a new benchmark price and all producers must adjust. Even if expansionary conditions continue, and demand forecasts are accurate, it is questionable whether current commodity prices are sustainable (that is, affordable) for the end users of Australia’s key export resources. </p>
<p>History also tells us that employers face challenges in creating mining workforces, particularly in remote locations. The debate about the mining boom has focused on whether expanding employment can be filled locally or with overseas workers. A key issue that has received less attention is the benefit employers derive from experimenting with non-standard employment agreements, particularly where there are concerns about long-term employment prospects.</p>
<p>The Deloitte report may help to refocus the current debate to shed light on why atypical employment is so attractive to resources employers who are looking forward to a slowing of the strong part of the two speed economy.</p>
<p class="fine-print"><em><span>Michael Barry does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>Michael Barry, Head of Department, Griffith Business School, Griffith University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h2>Strength in numbers: do ERA rankings add up for universities?</h2>
<p class="fine-print"><em>Published 12 June 2012.</em></p>
<figure><img src="https://images.theconversation.com/files/11609/original/6ckn7729-1339466524.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">How best to quantify the performance of Australian researchers?</span> <span class="attribution"><span class="source">Storyvillegirl</span></span></figcaption></figure><p>The <a href="http://www.arc.gov.au/era/">Excellence in Research for Australia Initiative (ERA)</a> is the federal government’s latest attempt to quantify the “excellence” (or otherwise) of Australian researchers.</p>
<p>And just a few short weeks ago submissions closed for ERA 2012, to the great relief of university research offices around Australia.</p>
<p>Unlike the dreaded <a href="http://www.innovation.gov.au/RESEARCH/RESEARCHBLOCKGRANTS/Pages/HigherEducationResearchDataCollection.aspx">Higher Education Research Data Collection (HERDC)</a> exercise - which rewards universities for pumping out as many papers as possible, even if they’re of low impact - ERA combines an assessment of quality and quantity, and many of its aims are to be welcomed.</p>
<p>And that’s a good thing, because bad research isn’t worth doing (or funding).</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/11602/original/qbrtrswt-1339466311.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/11602/original/qbrtrswt-1339466311.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/11602/original/qbrtrswt-1339466311.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/11602/original/qbrtrswt-1339466311.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/11602/original/qbrtrswt-1339466311.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/11602/original/qbrtrswt-1339466311.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/11602/original/qbrtrswt-1339466311.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/11602/original/qbrtrswt-1339466311.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Leo Reynolds</span></span>
</figcaption>
</figure>
<p>In my experience, Australian academics used to be uniformly of the opinion their research was of world class; or that, if it wasn’t, it was rapidly getting better; or that, if it wasn’t for all their teaching and administrative duties, they’d be awesome.</p>
<p>And then <a href="http://www.arc.gov.au/era/era_2010/outcomes_2010.htm">ERA 2010</a> came along.</p>
<p>In ERA 2010 the <a href="http://www.go8.edu.au/">Group of Eight (GO8) universities</a>, which get the bulk of competitive grant funding, did pretty well. Lots of ERA 4s and 5s with a few blemishes but, on the whole, the results were reassuring for the government (and the taxpayer).</p>
<p>The emerging universities didn’t do so well. The odd 5 and 4, some 3s, many 2s, and even (gulp) 1s.</p>
<p>In case you haven’t worked it out yet, an ERA 5 rating means well above world standard; ERA 1 means the opposite.</p>
<p>Universities brag about high ERA scores in the same way first-year undergraduates have their <a href="http://www.uac.edu.au/undergraduate/atar/">Australian Tertiary Admission Rank (ATAR)</a> scores tattooed on their foreheads.</p>
<h2>A change for the better</h2>
<p>ERA 2010 measured the period 2003-2008 inclusive, and the newer research institutions might argue their staff and outputs, as of 2012, are better than from the middle of last decade. To some extent they are probably right.</p>
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/11603/original/cc5k7zkz-1339466369.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/11603/original/cc5k7zkz-1339466369.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/11603/original/cc5k7zkz-1339466369.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/11603/original/cc5k7zkz-1339466369.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/11603/original/cc5k7zkz-1339466369.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/11603/original/cc5k7zkz-1339466369.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/11603/original/cc5k7zkz-1339466369.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/11603/original/cc5k7zkz-1339466369.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Leo Reynolds</span></span>
</figcaption>
</figure>
<p>The only problem is that when the next change of government occurs we can’t possibly expect the Coalition to want to keep the current system, because that would be admitting the Labor government did something right. Which, as improbable as it sounds, has many of us wondering whether perhaps this really is the end of (an) ERA?</p>
<p>Fortunately, ERA can be greatly improved. So apart from coming up with a different acronym, how could the Coalition change ERA for the better?</p>
<p><a href="http://website.lnp.org.au/senator-brett-mason">Senator Mason</a> take note!</p>
<h2>Hidden codes</h2>
<p>When the ERA 2012 assessment is completed later this year, we’ll all know what the <a href="http://www.arc.gov.au/">Australian Research Council (ARC)</a> thinks of our universities, succinctly distilled into a single number, between 1 and 5 (whole numbers only) for each research grouping.</p>
<p>And therein lies the problem: one whole number (or integer) for each field of research. So although there may be 100 researchers
in a given discipline area at any given university, their ERA ranking will be represented by a single digit, regardless of each individual’s own score.</p>
<p>Furthermore, it’s not hard to imagine that a lot of time and effort has been spent by university administrators cleverly “hiding” their poorer researchers and outputs in “ballast” four-digit codes.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/11604/original/jg8ry5xd-1339466405.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/11604/original/jg8ry5xd-1339466405.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/11604/original/jg8ry5xd-1339466405.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/11604/original/jg8ry5xd-1339466405.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/11604/original/jg8ry5xd-1339466405.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/11604/original/jg8ry5xd-1339466405.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/11604/original/jg8ry5xd-1339466405.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/11604/original/jg8ry5xd-1339466405.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Leo Reynolds</span></span>
</figcaption>
</figure>
<p>Some research outputs span areas, and can be submitted under different research codes. So if a university takes all of its low-quality outputs, places them into a sacrificial code and writes off the relevant authors, it can strengthen the other areas in which it aims for a high score. </p>
<p>Since ERA doesn’t report on the size of each research grouping, it becomes tempting to maximise the number of highly-ranked disciplines, even if they are tiny.</p>
<p>On paper, four 5s and a 1 looks a lot better than two 4s and two 2s, which might be achieved via some relabelling of research outputs. </p>
<p>Of course the silliness here is that the gross output of the university doesn’t change just because you’ve managed to hide your poorly-performing researchers in a few codes you are prepared to sacrifice. </p>
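The bookkeeping behind this gaming is trivial. A toy sketch (the quality scores and four-digit codes below are made up) shows how hiding the weak outputs lifts the reported per-code scores while the institution's overall average is untouched:

```python
from statistics import mean

# Hypothetical quality scores for five research outputs at one university.
outputs = [4.8, 4.6, 4.5, 1.2, 1.0]

# Honest submission: outputs filed under the codes that genuinely fit them.
honest = {"0201": [4.8, 4.6, 1.2], "0202": [4.5, 1.0]}

# Gamed submission: the two weak outputs relabelled into a sacrificial code.
gamed = {"0201": [4.8, 4.6, 4.5], "0299": [1.2, 1.0]}

def code_scores(submission):
    # Each code is judged on the average quality of what is filed under it.
    return {code: round(mean(vals), 2) for code, vals in submission.items()}

print(code_scores(honest))      # both codes middling
print(code_scores(gamed))       # one strong code, one write-off
print(round(mean(outputs), 2))  # the university's real output never changed
```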
<p>Doesn’t the government want universities to be doing more than manipulating research classifications? Hopefully, yes!</p>
<h2>A flawed procedure</h2>
<p>For some strange reason, as with electrons orbiting atomic nuclei, the ARC wants to force collections of researchers into “quantum states”. So although the raw scores might have left your ERA grouping at 4.49, you’ll probably get rounded back down to a 4. </p>
<p>One more publication might have made you a 4.51 and delivered the magical 5 rating!</p>
<p>Ideally the ARC could publish a <a href="http://en.wikipedia.org/wiki/Histogram">histogram</a> of each individual’s own rating within a discipline, between 1.0 and 5.0, and then averages, standard deviations, medians, maybe even <a href="http://en.wikipedia.org/wiki/Skewness">skewness</a>. This would avoid quantisation errors, and allow a truer representation of research excellence from each discipline.</p>
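If the cut-offs behave like ordinary rounding (as the 4.49 versus 4.51 example suggests), both problems are easy to sketch: the cliff at the band boundary, and the way one integer hides two very different rating profiles that summary statistics would distinguish. The ratings below are hypothetical:

```python
from statistics import mean, stdev

# The rounding cliff: a 0.02 difference in raw score moves a whole band.
print(round(4.49), round(4.51))  # 4 versus 5

# Two hypothetical disciplines that report the same integer...
uniform = [4.9, 4.5, 4.4, 4.6]   # consistently strong group
skewed  = [5.0, 5.0, 4.0, 4.4]   # carried by a couple of stars

for name, ratings in [("uniform", uniform), ("skewed", skewed)]:
    print(name,
          round(mean(ratings)),     # the single ERA-style number: 5 for both
          round(mean(ratings), 2),  # ...but the mean,
          round(stdev(ratings), 2)) # ...and the spread, tell a fuller story
```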
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/11605/original/rd7y6bzk-1339466433.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/11605/original/rd7y6bzk-1339466433.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/11605/original/rd7y6bzk-1339466433.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/11605/original/rd7y6bzk-1339466433.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/11605/original/rd7y6bzk-1339466433.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/11605/original/rd7y6bzk-1339466433.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/11605/original/rd7y6bzk-1339466433.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/11605/original/rd7y6bzk-1339466433.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Leo Reynolds</span></span>
</figcaption>
</figure>
<p>Extending this further, one could imagine a <a href="http://en.wikipedia.org/wiki/Scatter_plot">scatter diagram</a> with a dot for each publication within a code: the publication’s impact on the y-axis and the individual researcher on the x-axis.</p>
<p>Then we could get a feeling of whether a group’s outputs were dominated by one individual, dragged down by a few part-timers, or of high quality but limited in number.</p>
<p>But the deeper you start to look at these measures, the more you realise the flaws.</p>
<p>In some areas impact is relatively easy to measure from citations on short timescales but, alas, not in all.</p>
<p>Within a discipline, some areas and activities cite extremely well, and in others, such as instrumentation, not so well. In some disciplines, such as mathematics, citations are almost meaningless.</p>
<p>And once we start talking about “esteem” factors – such as editorial boards, members of the academy, the relative worth of a <a href="http://www.nobelprize.org/">Nobel Prize</a> to a Fellow of the <a href="http://royalsociety.org/">Royal Society</a> – it all starts getting a bit arbitrary.</p>
<p>How many Nature papers are the equivalent of a <a href="http://www.nobelprize.org/">Nobel Prize</a> or membership of an editorial board? Do two ten-citation papers become equivalent to one 20-citation paper? Is a 5 researcher plus a 1 researcher equal to two 3 researchers?</p>
<h2>Tough measures</h2>
<p>Ultimately, the government might have to accept that, as with the momentum and position of a subatomic particle, research excellence is impossible to quantify.</p>
<p>And yet in 2012 a portion of the income universities receive to help with the indirect costs of research, the so-called “Sustainable Research Excellence Threshold 2” funding, was allocated using the ERA results, with an ERA 5 worth seven times the value of a 3, and 1s and 2s worth nothing. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/11606/original/rjfxrrvn-1339466458.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/11606/original/rjfxrrvn-1339466458.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/11606/original/rjfxrrvn-1339466458.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/11606/original/rjfxrrvn-1339466458.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/11606/original/rjfxrrvn-1339466458.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/11606/original/rjfxrrvn-1339466458.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/11606/original/rjfxrrvn-1339466458.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/11606/original/rjfxrrvn-1339466458.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Leo Reynolds</span></span>
</figcaption>
</figure>
<p>The problem for the younger institutions was that an ERA 5 at a poorly performing “average research income per EFT” university got much less funding than an ERA 5 at a uniformly excellent one. </p>
<p>Why? Well, the total dollar amount for each ERA grouping was multiplied by the average category 1 research income per academic, so poor areas contributed nothing, and diluted the income rewards of the stronger ones.</p>
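The allocation logic can be sketched under stated assumptions: the article only gives the weights for a 5 (seven times a 3) and for 1s and 2s (zero), so the weight for a 4 below is invented, and all dollar figures are illustrative:

```python
# Hypothetical SRE Threshold 2-style allocation. Only the 3 -> 1, 5 -> 7 and
# "1s and 2s are worthless" weights come from the article; the weight for a
# 4 and the income figures are illustrative assumptions.
WEIGHTS = {1: 0, 2: 0, 3: 1, 4: 3, 5: 7}

def grouping_share(era_rating: int, avg_cat1_income: float) -> float:
    """Relative funding share for one ERA grouping: rating weight scaled
    by average category 1 research income per academic in that grouping."""
    return WEIGHTS[era_rating] * avg_cat1_income

# Same ERA 5 rating, very different income base - very different reward.
print(grouping_share(5, 200_000))  # established, research-intensive grouping
print(grouping_share(5, 20_000))   # fledgling grouping: a tenth of the reward
print(grouping_share(2, 500_000))  # a 2 earns nothing, whatever the income
```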
<p>So much for the incentive for the fledgling research universities to be rewarded for <a href="http://www.deewr.gov.au/HigherEducation/Policy/Documents/ANUsignedIA.pdf">concentrating their research efforts</a>! </p>
<p>This meant the Go8 universities cleaned up with this funding change, as the majority of their codes were rated ERA 4 and above, and their average research income was much higher. If you take the difference between the <a href="http://www.innovation.gov.au/Research/ResearchBlockGrants/Documents/2012RBGAllocations.xls">SRE 2012</a> and <a href="http://www.innovation.gov.au/Research/ResearchBlockGrants/Documents/2011RBGAllocations.xls">2011</a> numbers, you see that the University of Queensland pocketed an extra A$5.8 million this year – the Queensland University of Technology just A$15,000. </p>
<h2>Teaching kills</h2>
<p>The other, major problem with the whole ERA concept is that most university academics teach. And this, ladies and gentlemen, is the killer. Why? Because, unlike research quality, the government doesn’t seem to mind what your teaching quality is like.</p>
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/11608/original/h387j8sr-1339466500.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/11608/original/h387j8sr-1339466500.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/11608/original/h387j8sr-1339466500.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/11608/original/h387j8sr-1339466500.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/11608/original/h387j8sr-1339466500.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/11608/original/h387j8sr-1339466500.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/11608/original/h387j8sr-1339466500.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/11608/original/h387j8sr-1339466500.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption"></span>
<span class="attribution"><span class="source">Leo Reynolds</span></span>
</figcaption>
</figure>
<p>University funding is independent of teaching quality.</p>
<p>And quality teaching takes time, and that time makes it harder to do research, especially quality research.</p>
<p>So the incentive is clear: if you are going to do research, heavily concentrate in one area and minimise any time those people spend on teaching.</p>
<p>Don’t set teaching assignments such as essays, because they take time to mark. Go for multiple-choice questions instead, and perhaps lump most of the assessment into the end-of-term exam. </p>
<p>That will get your <a href="http://www.mis.admin.uq.edu.au/Content/Dashboards/QIndex.aspx">Q-index</a> (a daily measure of an individual’s research worth used at the University of Queensland) firing on all cylinders!</p>
<p>It’s also worth using sessional staff or teaching-only academics wherever possible to not dilute your research effort.</p>
<p>Just do what you can to propel your university up the research rankings and gain as many ERA 5s as possible!</p>
<p>That will make your VC truly happy, and might even allow them to <a href="http://www.abc.net.au/news/2012-05-14/anu-marathon-concert-protest-music-cuts/4009148">keep your music schools</a> and <a href="http://economics.com.au/?p=8450">arts faculties</a> open.</p>
<p><em>Professor Matthew Bailes is a member of the ERA-5 rated Centre for Astrophysics and Supercomputing at the Swinburne University of Technology.</em></p>
<p class="fine-print"><em><span>Matthew Bailes receives funding from the Australian Research Council and supercomputing vendors for his research.</span></em></p>
<p class="fine-print"><em>Matthew Bailes, Pro-Vice Chancellor (Research), Swinburne University of Technology. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h2>The ‘impact’ of research carries weight (but ripples matter more)</h2>
<p class="fine-print"><em>Published 4 May 2012.</em></p>
<figure><img src="https://images.theconversation.com/files/10293/original/9zymrswc-1336008537.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Do we need new vocabulary for measuring the “engagement”, “use”, “relevance” and “appropriateness” of research?</span> <span class="attribution"><span class="source">spettacolopuro</span></span></figcaption></figure><p>What has been the impact of the <a href="http://galileo.rice.edu/sci/instruments/telescope.html">invention of the telescope</a>? What has been the impact of <a href="https://theconversation.com/explainer-einsteins-theory-of-general-relativity-3481">Einstein’s Theory of General Relativity</a>, or the <a href="http://www.treehugger.com/corporate-responsibility/man-arrested-for-attempting-to-split-the-atom-in-his-kitchen.html">splitting of the atom</a>? </p>
<p>Yes, that’s right: the idea of measuring the “impact” of research is back in a big way. Within the research community and within government, plenty of people are <a href="https://theconversation.com/group-of-eight-view-of-measuring-the-impact-of-research-4818">thinking about this</a> in 2012. </p>
<p>As many have acknowledged, the Federal Government’s current <a href="http://www.arc.gov.au/era/">Excellence in Research for Australia</a> (ERA) initiative provides a <a href="http://www.atn.edu.au/atnconference/2011/atn-go8_symposium/report_of_2011_atn-go8_symposium.pdf">strong evaluation</a> of the quality of the research conducted in Australian universities, but doesn’t necessarily tell us much at all about the impacts of this research in the broader community. </p>
<p>The government’s 2011 review, Focusing Australia’s Publicly Funded Research, <a href="http://www.innovation.gov.au/Research/Pages/FocusingAustraliasPubliclyFundedResearch.aspx">recommended</a> a feasibility study be undertaken by the Department of Industry, Innovation, Science, Research and Tertiary Education on “possible approaches for developing a rigorous, transparent, system wide Australian research impact assessment mechanism”. </p>
<p>This will build upon work already underway across the university sector and in <a href="https://theconversation.com/institutions/csiro">CSIRO</a>. </p>
<p>Making more of an effort to understand how research interacts with the broader community is – to state our opinion up front – A Good Thing. It promotes thinking about the outside world – encouraging engagement beyond a particular academic discipline and awareness of the interests of the people actually funding our work, and the issues they might deem important. </p>
<p>It also focuses effort on clearly articulating the many ways in which our investments in research deliver benefits for society.</p>
<p>Yet perhaps in this nascent discussion about impact we have put the cart before the horse. Perhaps we have allowed the conversation to get away from us before we’ve had a chance to think through what it is we actually want to achieve in our governance of the Australian research system, and what we want to measure and reward. When it comes down to it, is “impact” even the right word? </p>
<p>“Impact” sounds like a concept from the world of physics – a scientisation of the very language we might use to talk about research and its place in society. “Impact” seems to denote a process that can be rational, can be measured – where bigger would equal better. </p>
<p>It also seems to describe a singular effect from research activity – someone does lots of work, and then there is an impact. Bang. Done. </p>
<p>But isn’t the age of linear cause and effect supposed to be over? Aren’t we supposed to be living in a more complicated, more contingent age of overlapping fields, where innovation happens at the boundaries? </p>
<p>To talk of “impact” in a singular, physical way is to slip back to a simple linear model of research and innovation. The dominant measures of the “impact” of research and innovation – dollars, people, publications and patents – still reinforce this model. </p>
<p>The problem is, <a href="http://sciencepolicy.colorado.edu/publications/special/honest_broker/index.html">decades of research on research and innovation</a> have shown that the process is neither this simple nor this linear.</p>
<p>And, of course, impact isn’t either. Research is part of, and contributes to, the complicated and overlapping worlds of human affairs. It shapes, and is shaped by, broader society. The tentacles of impact stretch into the past and far off into the future. </p>
<p>Which is not to say impact cannot be measured at all. We believe there are many opportunities to enhance the metrics of research and innovation, and that this is important work – it is crucial that individual researchers, research organisations and governments are engaged in the discussion. </p>
<p>But there are two key points – often overlooked – that must frame how this work progresses. </p>
<p>1) The new knowledge and new tools that stem from research do not create singular, one-off “impact”. Research activity leads to multiple impacts in different locations and different times. </p>
<p>2) Some of these impacts will be seen as positive by certain people in certain places and times, while others will be seen as neutral or even negative. </p>
<p>The word “impact” itself contains no normative assessment, yet many seem to be using it as a synonym for benefit. If we are going to assess research impact systematically, we will need to start to account for multiple impacts. </p>
<p>Consider the impact of the development of the <a href="http://en.wikipedia.org/wiki/Cochlear_implant">cochlear implant</a>. Hundreds of thousands of people have become able to hear, living lives that are (probably) easier and (possibly) richer. How would we measure this? </p>
<p>Much money has been made, and many jobs created. Simultaneously, many in the deaf community have come to see the technology as a form of “<a href="http://archie.kumc.edu/bitstream/handle/2271/848/STT-JUNW_2010_Heffley_Pediatric-Cochlear-Implants.pdf?sequence=1">cultural genocide</a>”. Should this be taken into account when assessing impact? </p>
<p>Researchers have also studied the introduction of new agricultural technologies, such as the <a href="http://news.ucdavis.edu/search/news_detail.lasso?id=7521">tomato harvester</a>, and their social, economic and environmental impacts. </p>
<p>While productivity and profitability rose with the introduction of certain technologies, this was also accompanied by job losses among certain classes of workers and the restructuring of farm holdings, gender roles in the workforce, and regional communities. </p>
<p>All of this raises important questions of accountability. Individual researchers would rightly be nervous about being measured and rewarded against such broad, long-term impacts, over which they have little or no control. So who should be held accountable for what? </p>
<p>If we are seeking to improve our assessment of the impacts of research in the wider community, what is the role for researchers and research organisations, and what is the role for government and the public?</p>
<p>Perhaps we should start by not jumping straight to “impact”. It’s not a simple linear process, but there are some things that happen between research and societal impacts, and perhaps these are things we should start to talk about and measure more. </p>
<p>Things such as “engagement” and “use”, and “relevance” and “appropriateness”. We need to pair the quantitative with the qualitative as we seek to better understand impacts, and develop new measures of engagement and use that go beyond our current – largely scientific and economic – metrics. </p>
<p>It might prove difficult, or even impossible, to answer the question about the full, long-term impacts of a particular piece of research, but it’s important that questions are being asked. </p>
<p>If we stop looking for one single big answer and focus instead on smaller steps along the way, there is a lot that can be done.</p><img src="https://counter.theconversation.com/content/6820/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Will J Grant has received funding from the Department of Industry, Innovation, Science, Research and Tertiary Education.</span></em></p><p class="fine-print"><em><span>The HC Coombs Policy Forum at ANU receives Australian Government funding through the "Enhancing Public Policy" initiative.</span></em></p>What has been the impact of the invention of the telescope? What has been the impact of Einstein’s Theory of General Relativity, or the splitting of the atom? Yes, that’s right: the idea of measuring the…Will J Grant, Researcher / Lecturer, Australian National Centre for the Public Awareness of Science, Australian National UniversityPaul Harris, Deputy Director, HC Coombs Policy Forum, Australian National UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/56492012-03-14T00:53:45Z2012-03-14T00:53:45ZUntangling red tape to turn academics into public intellectuals<figure><img src="https://images.theconversation.com/files/8576/original/7gbpddjj-1331681871.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Bureaucracy is stymieing academic engagement.</span> <span class="attribution"><span class="source">StripeyAnne</span></span></figcaption></figure><p>The idea that universities should return to their “core business” of teaching and research has become a favourite mantra of vice chancellors. It is reinforced by increasing evaluations imposed by Canberra to determine funding models. </p>
<p>Indeed, academics now spend more and more time answering questionnaires imposed by the federal government – a government that seems unconcerned with how its obsession with monitoring might detract from what it tells us we should be doing.</p>
<p>But universities also play a number of crucial roles in the intellectual, cultural and political life of the country. </p>
<p>Whatever the limits of the term “public intellectual” — which seems to suggest people who speak out on every topic bar those where they have expertise — it is an obligation of universities to provide the opportunity for their staff to engage in public debate and challenge received wisdom.</p>
<p>Doing this might be understood as part of “scholarship” in its fullest meaning, a term that seems to have disappeared in the emphasis on “research”, usually understood by politicians to mean developing a new technique immediately applicable in industry or medicine. </p>
<h2>Keeping academics under wraps</h2>
<p>It is a great irony that it was a Labor government that removed “impact” from its <a href="http://www.arc.gov.au/era/">Excellence in Research for Australia</a> (ERA) exercise. This privileged the kind of traditional academic publishing which is particularly accessible to certain sorts of disciplines.</p>
<p>The same government that wants to increase participation in higher education from those with low socioeconomic status is simultaneously demanding a time-consuming and old fashioned exercise in research evaluation that works against innovation in scholarship and certainly against enhancing public debate.</p>
<p>Most of us would be delighted were we able to publish a long review article in the New York Review of Books, as only very few Australian academics — Tim Flannery and Peter Singer come to mind — have succeeded in doing. Yet such an article would count less in the ERA exercise than a piece in a “refereed journal” which might be read by three people, namely the editor and the two reviewers. </p>
<p>My junior colleagues lament the fact that the system discourages them from writing for a broader audience than academic specialists.</p>
<h2>Thankless tasks</h2>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/8579/original/st3yp8x7-1331682482.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/8579/original/st3yp8x7-1331682482.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=905&fit=crop&dpr=1 600w, https://images.theconversation.com/files/8579/original/st3yp8x7-1331682482.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=905&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/8579/original/st3yp8x7-1331682482.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=905&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/8579/original/st3yp8x7-1331682482.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1137&fit=crop&dpr=1 754w, https://images.theconversation.com/files/8579/original/st3yp8x7-1331682482.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1137&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/8579/original/st3yp8x7-1331682482.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1137&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Altman’s seminal 1971 work.</span>
<span class="attribution"><span class="source">UQP</span></span>
</figcaption>
</figure>
<p>Over the past few months I have been engaged in a range of projects that I would argue contribute towards intellectual and public life. None of them is easily recorded in the evaluations we are required to make to Canberra. </p>
<p>At the risk of sounding overly self-promoting let me list some of these.</p>
<p>My first book <a href="http://www.uqp.uq.edu.au/Book.aspx/1175/Homosexual-%20Opression%20and%20Liberation">Homosexual: Oppression & Liberation</a> was republished. A <a href="http://www.alga.org.au/2011/389">major conference</a> and several public events took place in conjunction with this anniversary. This will certainly lead to a number of publications, but the sole recognition I can claim as a “researcher” is the short new introduction I wrote for the current edition.</p>
<p>Second, I am currently spending a lot of time helping develop the program of the <a href="http://www.aids2012.org/">International AIDS Conference</a> in Washington this year. These conferences are so important that the federal and Victorian governments have already pledged $2.5 million to ensure the 2014 conference will take place in Melbourne. Working on the program has allowed me to help develop connections between biomedical and social researchers. But how does one report on this in an ERA framework?</p>
<p>Third, I co-chair a network that is trying to expand the connections between development scholars and development NGOs. We have now held three successful conferences, and <a href="http://www.ausaid.gov.au/">AusAID</a> is likely to fund a position to further develop the network. As development studies is, sorry for the pun, under-developed in Australia, and as Australia is set to become one of the more substantial donors of international assistance, this is important for both academic and political reasons.</p>
<h2>Second fiddle to hacks and jokers</h2>
<p>In the time these three activities have taken, I could have probably written quite a few articles for peer-reviewed journals and applied for several grants. </p>
<p>As someone who is about to retire, and with the blessings of a supportive Deputy Vice Chancellor, Dean and school, it is possible for me to spend time on these sorts of activities. But emerging academics will have less freedom to do so, and as a result universities will seem increasingly irrelevant to public life. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/8577/original/6x8rx2h2-1331682317.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/8577/original/6x8rx2h2-1331682317.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/8577/original/6x8rx2h2-1331682317.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/8577/original/6x8rx2h2-1331682317.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/8577/original/6x8rx2h2-1331682317.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/8577/original/6x8rx2h2-1331682317.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/8577/original/6x8rx2h2-1331682317.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Dennis Altman in a rare appearance from an academic on Q&A.</span>
<span class="attribution"><span class="source">ABC Television</span></span>
</figcaption>
</figure>
<p>Even the serious media now rarely turn to universities for expertise. Programs like <a href="http://www.abc.net.au/tv/qanda/">Q&A</a> seem to prefer comedians and political hacks to academics. I say this as someone who has been on the program and cannot, therefore, be accused of sour grapes.</p>
<h2>Losing our best minds</h2>
<p>Last year I wrote a letter to then Science Minister Kim Carr. I suggested we could save literally tens of thousands of hours and dollars for both government and universities by radically simplifying the process of applying for ARC and NHMRC grants. </p>
<p>The amount of detail required is currently absurd: with a 20% chance of success one is now required to provide details as minute as the cost of a tape recorder for a potential research trip five years in the future. </p>
<p>If these details were only required once a grant was awarded, academics would suddenly have more time to create the track record they need to actually win a grant. </p>
<p>It is an awful irony that we all spend increasing amounts of time responding to monitoring and evaluation, even as Canberra insists that universities become less dependent on government funding.</p>
<p>Allen Ginsberg wrote that he had seen the best minds of his generation destroyed by madness. We are seeing the best minds in our universities destroyed by increasingly complex form-filling.</p>
<p><em>This is a summary of a speech given at an NTEU Conference where Professor Altman was asked to speak about the role of “public intellectuals” in universities.</em></p><img src="https://counter.theconversation.com/content/5649/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Dennis Altman has received funding from the ARC.</span></em></p>The idea that universities should return to their “core business” of teaching and research has become a favourite mantra of vice chancellors. It is reinforced by increasing evaluations imposed by Canberra…Dennis Altman, Professorial Fellow in Human Security, La Trobe UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/26372011-08-02T21:01:28Z2011-08-02T21:01:28ZHow one small fix could open access to research<figure><img src="https://images.theconversation.com/files/2616/original/oa.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Universities already stockpile academic papers so they can report their output to the government. But stockpiling the wrong version of the paper can restrict their right to make the paper available on open access.</span> <span class="attribution"><span class="source">Flickr/Gideon Burton</span></span></figcaption></figure><p>Providing equitable access to the findings of scholarly research is an expensive and vexed business, as many <a href="https://theconversation.com/topics/academic-journal-debate">recent stories</a> here on The Conversation have highlighted.</p>
<p>Open access offers a way to freely disseminate research. And there are big statements about open access – including the <a href="http://www.un.org/en/documents/udhr/index.shtml#a27">Universal Declaration of Human Rights</a> which says that everyone has the right “to share in scientific advancement and its benefits.”</p>
<p>Our own Minister for Innovation, Industry, Science and Research, Senator Kim Carr <a href="http://minister.innovation.gov.au/Carr/Speeches/Pages/OpenAccessandResearchConference.aspx">said</a> in 2008 that it was his ‘firm view’ that publicly funded research should be widely available to other researchers, industry and the general public. He specified he meant “full, open access to research data and outputs”.</p>
<p>Generally, researchers agree that having open access to research is a good idea. Despite all this, only about 15% of research is currently available on open access. So what is stopping people from making their work available?</p>
<p>The short answer is the way researchers are rewarded.</p>
<h2>Prestige power</h2>
<p>Being an academic is a weird existence. On top of their teaching load, academics do research, write articles up about it, peer review each other’s work, and act as journal editors. They expect no personal monetary reward for this.</p>
<p>Instead, reward in academia manifests as ‘prestige’: other people citing their work, winning grants, becoming a fellow of a society and other non-monetary returns. This is called the ‘academic gift principle’.</p>
<p>The prestige bestowed on a researcher depends strongly on where their work is published. It’s all about impact. So changes to the scholarly communication system strike fear in the hearts of many academics. This is a global situation for academia, and much as many people would like it to, it is not going to change in a hurry.</p>
<p>But open access doesn’t prevent people from publishing where they need to. The ‘green’ road to open access is when researchers publish where they choose but then deposit their final version of a paper into a digital repository. This version can be made open access if the publisher allows it.</p>
<p>Across the world, funding agencies are increasingly making it a requirement of funding that the results from research be openly accessible. These started with the <a href="http://www.wellcome.ac.uk/About-us/Policy/Policy-and-position-statements/WTD002766.htm">Wellcome Trust</a> in 2005, and have expanded exponentially since.</p>
<p>In the antipodes we are a little behind. The <a href="http://www.arc.gov.au/pdf/DP12_fundingrules.pdf">Australian Research Council Discovery Grants rules</a> for 2012 ask recipients to justify why they don’t make their work open access, but they don’t require it. The National Health and Medical Research Council promises to be more progressive. Indeed the NHMRC Chief Executive, Professor Warwick Anderson, mentioned that they “wanted to mandate public access to publications within 12 months using University repositories” at the <a href="http://researchmanagement.org.au/docs/Universities_Australia_National_Policy_Forum.pdf">Universities Australia National Policy Forum</a> in October last year. But there has yet to be an official statement to this end.</p>
<p>But while welcome, this will only make available research that has come off the back of funding grants, rather than all of the Australian research currently being published.</p>
<h2>One small fix</h2>
<p>So is there a solution for making Australian research open access? Yes. And it is a relatively simple change.</p>
<p>Currently <a href="http://www.innovation.gov.au/Research/ResearchBlockGrants/Pages/HigherEducationResearchDataCollection.aspx">all universities collect</a> information about, and a copy of, every research article written by their academics each year. Some government funding to universities stems from this collection.</p>
<p>So we already have a stockpile of all the research that is being done in Australia. But the version of the papers collected is the Publisher’s PDF. And in most cases this is the version we cannot make open access through digital repositories.</p>
<p>In addition, the Government has funded all universities in Australia to build an institutional repository to allow reporting to the Excellence in Research for Australia (ERA) program.</p>
<p>So the infrastructure is there and the processes are already in place. But there is one small change that has to happen before we can enjoy substantive access to Australian research.</p>
<p>The Government must specify that they require the Accepted Version (the final peer reviewed, corrected version) of the papers rather than the Publisher’s PDF for reporting.</p>
<p>If this happened, universities could just trawl the collection, checking the copyright arrangements of the publishers. Then with a (virtual) flick of a switch we move from 10-15% of material available to over 80% within the first year of reporting.</p>
<p>And even if the Government starts using ERA for funding allocation (which is likely), the process of collecting publication information is already established within all Australian universities.</p>
<p>The hardest type of change is behavioural. And this solution almost completely avoids behavioural change.</p>
<p>So come on Senator Carr – how about acting on your fighting words from 2008?</p><img src="https://counter.theconversation.com/content/2637/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Danny Kingsley does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Providing equitable access to the findings of scholarly research is an expensive and vexed business, as many recent stories here on The Conversation have highlighted. Open access offers a way to freely…Danny Kingsley, Executive Officer for the Australian Open Access Support Group, Australian National UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/25172011-07-27T20:53:46Z2011-07-27T20:53:46ZCopyright or copywrong? How journals control access to research<figure><img src="https://images.theconversation.com/files/2517/original/creativecommons.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Frustration with copyright restrictions placed on scholarly work in many journals has helped fuel the Creative Commons and Open Access movements.</span> <span class="attribution"><span class="source">Flickr/TilarX</span></span></figcaption></figure><p>Back in 1991, in the very earliest days of the internet, a group of high energy physicists began sharing their findings on a Los Alamos-based online archive called <a href="http://arxiv.org/">Arxiv</a>.</p>
<p>Their early experiments in the sharing of scholarly work helped spark what has become the Open Access movement, a worldwide push for more equitable access to research beyond the boundaries of scholarly journals.</p>
<p>The movement has had a somewhat haphazard trajectory but growing it is, as a glance at the <a href="http://www.openaccessmap.org">Open Access Map</a> attests. </p>
<p>This growth has been fuelled by policy developed by research funders, by institutions and by government.</p>
<p>Many research funders now make it a requirement of the receipt of a grant that research results be made available in Open Access.</p>
<p>Here, Open Access explicitly means that the commercial publishing industry will not have a monopoly on the expression of the findings of the research. This is a particularly important point in the developing world.</p>
<p>The U.S. funding agency, the <a href="http://www.nih.gov/">National Institutes of Health</a>, has taken the view that if taxpayer dollars are supporting research, that research should not be confined to an elite which has the privilege of access granted through institutional or individual subscription.</p>
<h2>The green road and the gold road</h2>
<p>Funder and government mandates are developing in different jurisdictions at different paces, but have become a clear part of the Open Access picture since the mid-2000s.</p>
<p>Institutional mandates have also developed, with academic institutions building <a href="http://roarmap.eprints.org/">repositories of scholarly work</a> to boost access to their research output.</p>
<p>This approach, in which an author or institution archives the work as submitted for publication in an online repository accessible at the same time as it is accepted, is often referred to as “the green road” to Open Access. </p>
<p>In some universities, the policy has been in place for some time, either for the whole institution or at the faculty or school level.</p>
<p>Queensland University of Technology (QUT) adopted a whole institution approach in 2004. There is some early evidence that the greater visibility of the research of the institution correlates with other measures of research esteem.</p>
<p>In Australia in recent years, several other universities have developed policies, most notably this week, <a href="http://theconversation.com/making-knowledge-free-anu-launches-open-access-research-database-2490">ANU</a>. A number of institutions in Europe – Liège, Salford and University College London, to mention a few – are examples. Perhaps most conspicuous is the mandate adopted at the <a href="http://roarmap.eprints.org/75/">Harvard University Faculty of Arts and Sciences.</a></p>
<p>An alternative to the green road is the gold road, where a journal provides its content on open access if the author pays an ‘input fee’ at the time of submission. One of the difficulties with this is the availability of the funds for such input fees. In some cases, funders have made it clear that the input fee is covered by the research grant.</p>
<h2>Who benefits?</h2>
<p>The fact is that the overwhelming majority of articles published in the traditional journal literature are given away by their authors, are refereed gratis by colleagues in the peer review process and are then published.</p>
<p>There is no individual return to the author. There is no return to the referee. But there is significant revenue generated for publishers reselling this content back to the institutions where the vast majority of scholarly authors work and reside. </p>
<p>By the 1990s, this model was attracting conspicuous and repeated commentary.</p>
<p>It was natural to wonder whether there might be other models for achieving the same goal – the sharing of research outputs.</p>
<p>Subscription-based access is confined to those who are entitled through the payment of the subscription price, with access controlled through electronic authentication systems online. This automatically excludes those who lack the ability to pay.</p>
<p>Therefore, the subscription-based approach has significant features which have come to be seen as disadvantages: inflated and inconsistent pricing and limited access. </p>
<p>Importantly, the open access argument does not apply to literature that does return royalties to the author. Indeed, the less money universities tie up in the expensive acquisition of the refereed research literature, the more would be available to buy publications that do return royalties to authors and their publishers.</p>
<h2>Creative Commons</h2>
<p>Publishers assert ownership of the scholarly content of their journals through copyright, which the author usually assigns to the publisher when their article is accepted for publication. </p>
<p>Open content licensing developed as a response to some of the quandaries posed by the concept of copyright as an exclusive control on expressions of ideas.</p>
<p>Perhaps best known of these is the <a href="http://creativecommons.org/">Creative Commons</a> set of licences, generically international but ported to national jurisdictions. QUT was the lead institution for this in Australia.</p>
<p>QUT’s policy on open access made it clear that the material concerned would not be located in the repository if it was “to be commercialised … contains confidential material, or of which the promulgation would infringe a legal commitment by the university and/or the author.”</p>
<p>Evidence is developing that citation frequency increases with the greater visibility of research. Other benefits include stronger linkages between researchers and wider communities and the attraction of higher degree research students based on greater visibility of existing research fields.</p>
<p>At a recent conference at <a href="http://public.web.cern.ch/public/">CERN</a> in Geneva, the editor of one of the world’s larger open access journals proposed that, eventually, most scholarly material will be available through five or six major gold providers. He was keen to point out that this is merely a speculation. </p>
<p>Or a tipping point may be reached with the provision of material through institutional and, in some cases, disciplinary repositories. It is hard to make precise predictions but the overall trend is undeniable. </p>
<p>The world of traditional scholarly publishing may well co-exist for a while, adding value to scholarly material in a way that national and international academic communities are willing to pay for.</p>
<p>Alternatively, publishers may abandon journal titles per se as a unit of economic (and quality) currency, and seek new business in which they extend their role in providing sophisticated and reliable integrity checks on the quality of research articles.</p>
<p>The pricing for such activity will be different from past commerce in these goods and the traditional publishing model will inevitably undergo further significant change.</p><img src="https://counter.theconversation.com/content/2517/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Tom Cochrane's external duties include Chair, Australian eResearch Infrastructure Council; Director, Australian Digital Alliance; Chair, Australian Libraries’ Copyright Committee; and Director, Queensland Cyber Infrastructure Foundation. He is also a member of the National Research Infrastructure Council, a member of the Publications Board of CSIRO, and is on the Board of The Queensland Museum. Professor Cochrane is also a Director on the Board of the University’s commercialisation company, QUT bluebox.
He was co-leader of the Creative Commons project for which QUT is the institutional partner for Australia and led the approach mandating open access for refereed research produced by the University into formal policy in 2003.</span></em></p>Back in 1991, in the very earliest days of the internet, a group of high energy physicists began sharing their findings on a Los Alamos-based online archive called Arxiv. Their early experiments in the…Tom Cochrane, Deputy Vice-Chancellor Technology, Information and Learning Support (TILS), Queensland University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/25112011-07-26T21:03:15Z2011-07-26T21:03:15ZExplainer: Open access vs traditional academic journal publishers<figure><img src="https://images.theconversation.com/files/2485/original/databasebooks.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A growing number of academic institutions are building free online databases of their scholarly output. But publication in a big name academic journal still holds cachet for most academics. </span> <span class="attribution"><span class="source">Flickr/mandiberg</span></span></figcaption></figure><p>As the cost of accessing academic journal articles increases, a growing number of academic institutions are building publicly accessible databases of scholarly work.</p>
<p>But how much of a threat to the traditional subscription-based academic journal model does the open access movement really pose?</p>
<p>In this Q+A, Stephen Cramond, Electronic Content Manager in the University of Melbourne Library outlines some of the issues in the debate.</p>
<hr>
<p><strong>How much do universities spend on subscriptions to top journals?</strong></p>
<p>Some journals can cost as much as $25,000 per annum for a subscription. An institution the size of the University of Melbourne could be paying around $10 million a year for the journals to which it subscribes. And because most journals are published in the US or Europe, even slight variations in foreign exchange can play havoc with the rest of the Library budget.</p>
<p>That $25,000 per annum figure is an outlier and the amount per journal title is quite discipline specific. The further into the physical and life sciences you go, the more expensive they tend to be. The further into the arts and humanities you go, the cheaper they tend to be.</p>
<p>That’s partly a reflection of the number of papers that are being produced in those disciplines. The more papers there are, the more expensive the journal becomes. </p>
<p><strong>So what trends are developing in the field of open access publishing?</strong></p>
<p>Many institutions, including <a href="http://dspace.mit.edu/handle/1721.1/49433">MIT</a> and the <a href="http://library.unimelb.edu.au/digitalcollections">University of Melbourne</a>, have developed what’s called an institutional repository. That’s a database where they seek to make available on open access – free at the point of use – the research output of their researchers. MIT, along with a small but growing number of universities, has mandated that their researchers must do this.</p>
<p>They are taking advantage of an out-clause that many publishers put in place, by which an academic can post to his or her website a version of the publication that’s going to the journal – one that is not the publisher’s own marked-up copy.</p>
<p>A version of the article before peer review is called a pre-print, and the final draft after peer review is called a post-print.</p>
<p>About 65% of journal publishers allow for either pre-print or post-print versions of those papers to be made available in that way. </p>
<p>So that’s what MIT is doing; that’s what parts of Harvard are doing. There is a <a href="http://roarmap.eprints.org/">register of institutions</a> that have either a partial or complete mandate of this kind. </p>
<p><strong>Can people outside of those institutions access those repositories?</strong></p>
<p>Absolutely, that’s the intention. You don’t have to go to those repositories directly to access the information: public search engines like Google or <a href="http://scholar.google.com.au/">Google Scholar</a> can harvest their content and make it findable. </p>
<p><strong>So how much of a threat does this model pose to the traditional subscription-based journal system?</strong></p>
<p>Originally, going back 10 years, some of the original proponents of the open access movement would have seen it as somewhat subversive, as an attempt to reform the existing journal system.</p>
<p>Ten years have passed and it’s arguable how much traction that kind of open access model has had.</p>
<p>However, open access and the broader changes in university and university library budgets have forced the traditional publishers to respond and they have responded in two ways.</p>
<p>Firstly, by making subscription-based journals more affordable to their customer base – that is, universities and university libraries. They have developed what’s known as the Big Deal. That means if you sign up for, say, three years you get access online not only to the journals to which you have been subscribing in print until now but also the complete portfolio of publisher X or publisher Y, so every journal that they publish.</p>
<p>That has changed the picture tremendously over the last 10 years and, in some ways, access in the first world to journal information has never been better.</p>
<p>But the Big Deal comes at a cost. Firstly, it is hard to get out of because it’s so attractive. Secondly, it has tended to distort the way library collections have grown: larger proportions of library budgets have been spent on journals and smaller proportions on books or on new and emerging media.</p>
<p>The second response that publishers have made is to develop conventional journals published on an open access rather than a subscription model. </p>
<p>The most successful example so far is a firm called <a href="http://www.biomedcentral.com/">BioMed Central</a>, which was spun off from a conventional publishing house. They have since been acquired by <a href="http://www.springer.com/?SGWID=0-102-0-0-0">Springer</a>, the second or third largest conventional publisher but the open access model has remained in place. </p>
<p>The fact that Springer bought BioMed Central was significant because it reinforces the impression that Springer is interested in flipping the business model, if they can, and moving in an orderly way to an open access model, wherein the author pays the cost of publication of the article, with the understanding that the article is then free for anyone to consume.</p>
<p>That can be hard for individual academics to do because it can be $2000 or $3000 per article, depending on the journal. There are some instances of universities or grant funding bodies offering subsidies to ease the pain for the individual academic but these remain the exception rather than the rule.</p>
<p><strong>But if traditional publishers are looking at moving toward open access, how can they stay profitable as a business?</strong></p>
<p>Well, quite. From my perspective as a consumer of their output, if you look at the way their portfolio has changed in terms of subscription journals that have developed an open access stream or new titles they are developing which are completely open access, it’s clear that they at least want to test the possibilities and limits of open access as a model. </p>
<p>If open access is going to be the dominant form, they want to place themselves advantageously to do that. But clearly it’s tremendously difficult to carefully engineer a complete transition with no disruption to existing businesses.</p>
<p>That’s the bind that publishers are in. They are trapped by their own success. They are trapped by the success of the prevailing paradigm.</p>
<p><strong>If 65% of journals allow academics to publish at their own institutional repositories, presumably that means the other 35% do not allow that?</strong></p>
<p>That’s right. 35% don’t allow it.</p>
<p>And as for the 65% of publishers that do allow it, that doesn’t mean in 65% of cases academics are taking advantage of this. Not at all. There are too many existing workflow impediments in the way to make that possible.</p>
<p>That’s why some institutions mandate it. At a senior level in the institutions, they think it’s important in terms of meeting the institution’s mission by achieving the widest possible dissemination of their scholarly output.</p>
<p><strong>How does copyright law apply in the case of published research papers?</strong></p>
<p>By convention, the copyright belongs to the individual academic, not the academic institution. So it’s the academic’s copyright that is being given away when the paper is accepted by a publisher, not the university’s.</p>
<p>That’s almost the problem. For an individual, it’s either all too easy to fill in a form to cede copyright ownership [to the publisher] or else it’s simply too intimidating or laborious to take on the notion of challenging it or negotiating around it.</p>
<p>In fact, you can negotiate certain rights and don’t have to give away copyright in toto. It can depend on the individual and on the publisher. </p>
<p>I suspect most researchers don’t think about the implication of giving away copyright. But it’s a limitation of what they can then do with their own work. It’s an exclusive arrangement.</p>
<p>It might limit what they can do with their own work in terms of distributing to every student in the class they are teaching or putting a copy of the published work on their own website. They may be able to do that with a pre-print but they may really want to do that with the published version and face limitation.</p>
<p><strong>How much consternation in the academic world is there with this issue?</strong></p>
<p>I think at times of crisis it becomes an issue. One shining example of a community-developed open access journal is <a href="http://www.plos.org/">Public Library of Science</a> and it developed, again a decade ago, at a time when the conventional publishing model truly seemed to be at breaking point. In Australia, this crisis was made worse because the Aussie dollar was sinking like a stone at the time and we were facing massive journal cancellations.</p>
<p>In times of crisis, it really impacts on the academic and you see people starting to engage with the idea of open access. If things are humming along nicely, then it tends to recede a bit. It’s certainly an issue for librarians who see the budgetary implications first hand.</p>
<p>The journal system works well, in a way, because it’s quite transparent as to what success means and looks like. It means being published in top ranked journals that have enjoyed global distribution. It’s a benchmark that everyone understands.</p>
<p><strong>What about access by developing countries that can’t afford journal subscriptions?</strong></p>
<p>It’s not that the publishers are bad guys and don’t have a conscience. They do actually have two or three initiatives going, one of which is called <a href="http://extranet.who.int/hinari/en/journals.php">HINARI</a>, a <a href="http://www.who.int/hinari/en/">World Health Organisation-brokered initiative</a> where third world countries can get access for free to the online collections of 250 publishers, including the major publishers like <a href="http://www.elsevier.com/wps/find/homepage.cws_home">Elsevier</a>, Springer and <a href="http://www.wiley.com/WileyCDA/">Wiley</a>.</p>
<p>Countries like <a href="http://theconversation.com/how-academic-journals-price-out-developing-countries-2484">South Africa</a> that are betwixt and between might not be able to take advantage of this, but it does mean that formally defined developing countries do enjoy some access.</p>
<p><strong>So what does the future hold? Will the open access or the traditional subscription model win out?</strong></p>
<p>I wish I knew. With the advent of the web we are inured to very rapid change in business models, so it’s interesting that the journal publication business has been so relatively resistant to change. I think that’s a function of the prevailing academic culture, by which I mean the importance to academics of publication in recognised brand name journals in terms of promotion, tenure and winning research grants.</p>
<p>A lot of the academic’s life is tied up in getting published in those high-impact journals and those journals are published by conventional subscription publishers.</p>
<p>For as long as the incentives to publish in those distribution channels remain stronger than the countervailing pressures for open access, I think we are in for a long period of transition.</p>
<p class="fine-print"><em><span>Stephen Cramond has consulted to academic publishers in the past.</span></em></p>As the cost of accessing academic journal articles increases, a growing number of academic institutions are building publicly accessible databases of scholarly work. But how much of a threat to the traditional…Stephen Cramond, Electronic Content Manager in the Library, The University of MelbourneLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/24842011-07-25T21:13:55Z2011-07-25T21:13:55ZHow academic journals price out developing countries<figure><img src="https://images.theconversation.com/files/2463/original/libraryshot.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">How does the high cost of academic journal subscriptions impact the developing world?</span> <span class="attribution"><span class="source">Flickr/Book Aid International</span></span></figcaption></figure><p>University libraries in the developed world are struggling to pay academic journal subscription costs – so how can universities in developing countries hope to pay? </p>
<p>In this Q+A, Professor Adam Habib, Deputy Vice-Chancellor Research, Innovation and Advancement at the University of Johannesburg outlines some of the ways a for-profit academic journal model affects developing countries.</p>
<hr>
<p><strong>What sorts of costs are involved for a researcher who wants to get his or her paper published in a top journal?</strong></p>
<p>It differs from journal to journal. In some journals, it may involve a couple of hundred dollars, in others it may be a thousand or two. Researchers sometimes have to pay for this themselves. However, where universities have some resources, these costs are often borne by the university. </p>
<p>This is ironic, is it not? The costs of the research are borne by the university in the form of subsidising direct research costs, establishing the research infrastructure and paying the salary of the researcher.</p>
<p>Then, the university again has to expend resources, purchasing the research output that its resources enabled the production of in the first place. </p>
<p><strong>Once peer reviewed, the author of research sometimes has to sign over the copyright to the academic journal. What problems does this present?</strong></p>
<p>The most obvious is that authors cannot distribute their research as widely as they may want to because they may be violating copyright law.</p>
<p>The interest of the publishing houses is for someone to buy the article, not to enable its dissemination to the widest audience possible. The primary interest of any researcher is to get his or her research out to the widest possible audience.</p>
<p>The profit motive in the publishing houses therefore acts as a constraint on the dissemination goal. </p>
<p><strong>Once published, what sorts of costs does the university have to bear to get subscription to the journal in which their researcher’s work is published?</strong></p>
<p>The universities pay a fortune subscribing to journals. Any librarian in the developed or developing world will tell you that the costs of academic journals are running way beyond inflation and library budgets.</p>
<p>The net effect is that universities, especially those in the developing world, cannot access important journals. The result is that students suffer, particularly the poorest and most vulnerable.</p>
<p><strong>What are your criticisms of this system?</strong></p>
<p>Well, it is a completely feudal system.</p>
<p>The costs of the research production are borne by the universities, and as a result, by public monies, in most cases. Then, private companies publish the research, and charge the universities and public institutions for the very research outputs that they paid for. This is effectively the subsidy of the private sector by public money.</p>
<p>There is a myth that this is an example of entrepreneurialism. In my view, all it does is facilitate enrichment at public cost with huge consequences for those most disadvantaged.</p>
<p><strong>Why don’t we often hear academics and universities speaking out on this issue?</strong></p>
<p>Academics are increasingly being heard on this issue. The growth of the open access movement across the developed and developing world is one academic response to the enrichment dynamics inspired by the commercialisation of the academic publishing industry.</p>
<p>But we obviously could have been heard earlier and louder. One reason this may not have happened is because academics could not envision what an alternative model of academic publishing would involve.</p>
<p>Also there has been an assumption in governance and other decision-making circles across the world that the only efficient way of organising activities is by relying on market mechanisms. This crude, un-nuanced perspective did not want to understand the social costs of an unregulated market, with the result that issues of equity and impact on marginalised communities were not worthy considerations in decision-making in this regard.</p>
<p>The academic sector is therefore merely reflecting the prevailing consensus that seems to govern our societies. Only when these assumptions came under attack after the unravelling of the <a href="http://en.wikipedia.org/wiki/Washington_Consensus">Washington Consensus</a> did academics and others look at alternative models of organising the distribution of academic knowledge. </p>
<p><strong>How is this system affecting universities from developing countries?</strong></p>
<p>It is not fair at all. It impacts negatively on all our societies, but its most devastating consequences have been on the poor and marginalised of our world.</p>
<p>Simply put, students from poor backgrounds in large parts of the developing world will not have access to quality academic journals in their universities. This means that they will not be as well trained, and as a result will not have the same opportunities as the privileged. Is this not a violation of the principle of equal opportunities for all?</p>
<p><strong>What should be done instead?</strong></p>
<p>Well, one alternative would be to fund academic journals directly through the public purse.</p>
<p>The reason that academics resort to the private publishing industry is because of a lack of funds to underwrite the costs of producing their journals. If public monies were made available for this, then they would not be obliged to resort to the commercial sector.</p>
<p>And this need not be new money. After all, it is ultimately public money that underwrites the academic libraries which are the main market for these journals. A simple reorganisation of public funds could then enable the public financing of academic journals.</p>
<p>If this is not realisable in the immediate short term, one could look to regulating the commercial academic publishing industry.</p>
<p>There are existing models of doing this in other sectors. For instance, there are a number of international protocols governing the health care industry to ensure the provision of life-saving drugs, including the provision of anti-retrovirals for HIV-AIDS sufferers. Could similar protocols not be developed for the commercial academic publishing industry?</p>
<p>After all, higher education is an important public good on which inclusive and equitable development across the world is dependent.</p>
<p class="fine-print"><em><span>Adam Habib does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>University libraries in the developed world are struggling to pay academic journal subscription costs – so how can universities in developing countries hope to pay? In this Q+A, Professor Adam Habib…Adam Habib, Vice-Chancellor and Principal, University of the WitwatersrandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/19252011-07-25T21:09:33Z2011-07-25T21:09:33ZCracking the code of ethical research practices<figure><img src="https://images.theconversation.com/files/2292/original/research.jpg?ixlib=rb-1.1.0&rect=238%2C150%2C2713%2C1786&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Publishing practices in the biomedical and social sciences commonly fail to conform to Australian codes of practice. </span> <span class="attribution"><span class="source">AAP</span></span></figcaption></figure><p>In February, Minister for Innovation, Industry, Science and Research Kim Carr said Australian taxpayers could be confident the research activities they fund meet the “highest ethical and moral standards”. </p>
<p>Why was the Minister so confident? Because the Gillard Labor Government has established an Australian Research Integrity Committee (ARIC). </p>
<p>The establishment of the Committee was welcome news. For too long, Australia has pretended that our research practice is sound. </p>
<p>In recent years, several high profile cases acted momentarily to shake this confidence: allegations of misconduct, including guest authorship practices, against UNSW Professor Bruce Hall in 2002; the resignation of Monash Professor David Robinson after accusations of plagiarism; and allegations that a Sydney endocrinologist might be one of many who fronted journal articles ghost authored by Wyeth pharmaceuticals. </p>
<p>Only one of these allegations was ultimately upheld. However, a qualitative study (published in February 2010 in Social Science and Medicine), carried out in two Australian universities but reflecting a wider experience, suggests that there are systemic problems at least in the publication of research, as do anecdotal reports from research leaders at the 2011 Quality in Postgraduate Research Conference. </p>
<p>We would suggest that publishing practice, particularly with respect to authorship, that does not conform with the Australian Code for the Responsible Conduct of Research is common in the biomedical and social sciences in Australia, although this is far from a uniquely Australian issue: similar behaviour has been recorded internationally. </p>
<p>Specific cases of misconduct which have reached public scrutiny have been few and those that have done so have been through media publicity. </p>
<p>Whistleblowers do poorly and it is possible that most researchers walk away from such confrontations but it is also not in the interest of institutions to publicise such issues. </p>
<p>Judging the state of research ethics in Australia on the basis of identifiable cases is almost certainly illusory but it is difficult to gauge the extent of the problem without better evidence. </p>
<p>For some years now, researchers and institutions have been under pressure to ‘perform’ and to be accountable for the public funding provided to them for research.</p>
<p>Performance measurement, in many disciplinary areas, is built around assessment of ‘quality’ of journals, the number of publications, and the citation rate of those publications relative to the ‘norm’ within the discipline. </p>
<p>The introduction of the research quality assessment framework, Excellence in Research for Australia (including recent amendments to it), has shifted emphasis particularly to the last of these and reinforced the need for authors to pursue strategic publishing practices to maximise bibliometric success.</p>
<p>The ways in which research performance is measured can undermine ethical behaviour in research publication. </p>
<p>At the very least, in the present ERA climate, there are incentives to pursue strategic publishing practices which shift the focus of publication away from research dissemination and towards bibliometric factors.</p>
<p>In particular, there are incentives under the ERA framework to maximise the number of citations. Self-citation is counted in the analysis, so that for researchers and institutions to self-cite would seem sensible as would reciprocal ‘cite-your-mate’ approaches.</p>
<p>It is possible that coalitions may form to cross-cite papers. The Australian Research Council has already considered this matter and ruled that self-citation and ‘cite-your-mate’ practices are not significant enough to influence overall citation outcomes and therefore will not be excluded, although it is not clear on what basis this decision was made. </p>
<p>These measures are part of the ‘game’ and each time the ‘game’ changes in terms of the expectations of the performance framework, researchers scramble to adapt. </p>
<p>Kim Carr’s recent Ministerial Statement acknowledged this adaptive practice and that the use of journal rankings within the ERA had acted to distort publication practice.</p>
<p>The introduction of the ERA seems unlikely to change the norms of behaviour in authorship practice. Without effective intervention strategies, authorship will continue to be used as a bartering chip in the ‘commerce’ of research and in many cases will continue to reflect the exercise of ambition and power differentials in academia. </p>
<p>Distortions in publication practice such as citation gaming and inappropriate attribution of authorship are part of a continuum of ‘normal misbehaviour’ which, at its extremes, include fabrication, falsification and plagiarism. </p>
<p>The Australian Code for the Responsible Conduct of Research has been an attempt to regulate some of the more egregious excesses and it does provide disempowered junior researchers with some leverage in publication conundrums. </p>
<p>Yet monitoring and enforcement are rare and, four years on from publication of the Code, many institutions have yet to comply fully, particularly with respect to staff ethics training, ethical authorship practice and storage of research data. </p>
<p>The Code is, in part, an acknowledgement that there are considerable shortcomings to any system which places pressure on researchers to perform and sets up output targets when there are clear imbalances in power and opportunities for individuals and institutions to misuse that power. </p>
<p>Carr has suggested that it is an institutional responsibility to regulate and inspire: that institutions “must provide the education to ensure their staff behave ethically and do not foster ‘negative research cultures’”. </p>
<p>It is self-evident that we need a strong system of research governance to counteract the pressures and perverse incentives placed on researchers but where is the incentive for researchers and institutions to act?</p>
<p>Beyond the Code we need a supported dialogue about the ways in which institutions can work to create cultures of integrity and it is possible that the Code itself needs further work. </p>
<p>For example, there may be room for auditing systems to support good practice, probably with associated incentives and sanctions. </p>
<p>Good practice could also be supported through independent institutional advisors in research ethics, improved researcher training in research ethics and strong statements about unacceptable research practice. </p>
<p>The Australian Research Integrity Committee can provide an avenue for the prosecution of egregious unethical behaviour. However, Mr Carr should understand that it cannot guarantee that the research activities the public funds “meet the highest ethical and moral standards”. </p>
<p>We will need to do far more before we can say that.</p>
<p class="fine-print"><em><span>Jackie Street is currently funded by the NHMRC and has received funding from the NHMRC and ARC in the past. </span></em></p><p class="fine-print"><em><span>Annette Joy Braunack-Mayer receives funding from the Australian Research Council and the National Health and Medical Research Council. </span></em></p>In February, Minister for Innovation, Industry, Science and Research Kim Carr said Australian taxpayers could be confident the research activities they fund meet the “highest ethical and moral standards…Jackie Street, Senior lecturer, University of AdelaideAnnette Joy Braunack-Mayer, Professor in Ethics, University of AdelaideLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/24752011-07-24T21:13:46Z2011-07-24T21:13:46ZPutting a price on knowledge: the high cost of academic journals<figure><img src="https://images.theconversation.com/files/2454/original/journals.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Does the cost of academic journals stymie learning?</span> <span class="attribution"><span class="source">Flickr/the.Firebottle</span></span></figcaption></figure><p>The phrase ‘publish or perish’ is familiar to all academics, who face enormous pressure to have their work featured in the top academic journals. Career progression, job security and pay rises can depend on getting a by-line in the ‘right’ journal.</p>
<p>Journals do society a great service by seeking out, editing, improving upon and publishing the top scholarly work, thereby contributing to the global pool of knowledge.</p>
<p>But some publishers charge academics high ‘article processing fees’, obtain the copyright licence for the resulting article and then bill academic institutions thousands of dollars for subscriptions so they can access their own researchers’ work.</p>
<p>Frustration with the current system has prompted one protester to <a href="http://theconversation.com/more-than-18-000-journal-articles-leaked-online-to-protest-data-theft-arrest-2467">post more than 18,000 journal articles to the illegal file sharing site The Pirate Bay</a>.</p>
<p>Several academic institutions have developed open access models, where scholarly work is kept on databases that can be accessed for free. Working with some publishers, the <a href="http://libraries.mit.edu/sites/scholarly/mit-open-access/open-access-at-mit/mit-open-access-policy/mit-faculty-open-access-policy-faq/">Massachusetts Institute of Technology has made many hundreds of articles available on its DSpace@MIT database</a>.</p>
<p>In this Q+A, Professor Simon Marginson from the Centre for the Study of Higher Education at the University of Melbourne outlines some of the issues in this debate.</p>
<hr>
<p><strong>What concerns you, if anything, about the way the academic journal system works at present? What are the benefits versus the downsides?</strong></p>
<p>I’m not sure how widespread the practice is of making authors pay to submit or be reviewed for publication. It does not happen in any of the social science and humanities journals I know – I am on 15 editorial boards.</p>
<p>It is a perversion of the process – intellectual merit and originality should be the only basis for inclusion. It happens in part because in some disciplines the most authoritative knowledge is centred on a very small number of journals.</p>
<p>This can be quite conservatising, marginalising ‘off the wall’ innovations which often have to come from outside the most prestigious journals and sometimes happen in the open source / open access publishing world.</p>
<p>Of course, the system of payment for reviews and so on favours established scholar-researchers (who get their institutions to pay all costs) and disadvantages young scholars in richer countries who lack ‘pull’ inside the universities, and disadvantages all institutions and people in the developing countries.</p>
<p>The high cost of journal subscriptions also disadvantages institutions and people in the developing world – given the importance of knowledge in economic development and cultural power, unequal access to worldwide knowledge as published in journals is a primary factor in reproducing global inequalities.</p>
<p><strong>What can be done to tackle those problems?</strong></p>
<p>Firstly, foreign aid for research and education should extend to programs for sharing journal subscriptions.</p>
<p>Secondly, a serious problem is the centralisation of the formal disciplinary journals in a small number of major publishing companies, for whom turning the circulation of knowledge into a mini-economy with artificial scarcity and cost barriers makes good sense as a business model.</p>
<p>However it is a bad model for cultural circulation. As <a href="http://en.wikipedia.org/wiki/Joseph_Stiglitz">Joseph Stiglitz</a> pointed out, knowledge is a global public good, non-rivalrous and non-excludable once the moment of first discovery is passed.</p>
<p>The widest possible availability is an optimising condition of both knowledge formation itself and of the innovation economy. Any barriers to publishing and receiving are artificial – knowledge is naturally ‘open source’ and flows freely. So vigorous participation in open source or open access publishing is the major antidote to the high-cost journal regimes.</p>
<p>One advantage that open source publishing has is immediacy – there is usually a long lag time for publication in the major journals. Two or three year delays are common. </p>
<p><strong>How well are universities coping with the cost of journal subscriptions?</strong></p>
<p>Few universities can afford to maintain the full set of minimum necessary journals to be able to provide research infrastructure on a comprehensive basis. Indeed, even the strongest Australian university libraries are forced to do without material they need to hold.</p>
<p>In New Zealand the problem is significantly worse, and in major universities in countries such as Indonesia, the Philippines or Vietnam there is simply no possibility of providing even the most minimal set of necessary journals. </p>
<p><strong>What do you think of the open access models developing here and overseas?</strong></p>
<p>That’s the way to go, but open access publishing should take two forms: (1) papers validated by a journal board and review process, and (2) ‘anything goes’ publishing. And the two forms need to be carefully distinguished.</p>
<p class="fine-print"><em><span>Simon Marginson receives funding from the Australian Research Council Discovery Projects program.</span></em></p>
<p class="fine-print"><em><span>Simon Marginson, Professor of Higher Education, The University of Melbourne. Licensed as Creative Commons – attribution, no derivatives.</span></em></p>
<hr>
<h1>Why the ERA had to change and what we should do next</h1>
<figure><img src="https://images.theconversation.com/files/2169/original/kimcarr.jpg?ixlib=rb-1.1.0&rect=77%2C212%2C2716%2C2053&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Kim Carr has canned the journal rankings system: but what should be next? </span> <span class="attribution"><span class="source">AAP</span></span></figcaption></figure>
<p>There was much celebrating around Australia’s university campuses when the Minister for Innovation, Industry, Science and Research, Senator Kim Carr, announced changes to the “Excellence in Research for Australia” (ERA) scheme. </p>
<p>A key decision was the removal of the prescriptive ranking system of A*, A, B and C level journals. In its place is to be a more complex measure of journal quality, with greater recognition of multi-disciplinary research.</p>
<p>In his media statement, Carr noted: “there is clear and consistent evidence that the rankings were being deployed inappropriately within some quarters of the sector, in ways that could produce harmful outcomes, and based on a poor understanding of the actual role of the rankings. One common example was the setting of targets for publication in A and A* journals by institutional research managers”.</p>
<p>The change should come as no surprise. Previous federal government attempts to measure research quality were also problematic. </p>
<p>One earlier scheme attempted to measure quantity, quality and impact and, as with the ERA, it cost millions of dollars and thousands of hours of time as Australia’s universities scrambled to get the best score.</p>
<p>Major problems with the ERA were its narrow focus on journal rankings as a measure of academic performance and a lack of transparency about how some journals were ranked. </p>
<p><a href="http://scienceandtechnologyaustralia.org.au/about-science-australia/">Science and Technology Australia</a> (formerly the Federation of Australian Scientific and Technological Societies) warned in 2010 that the ERA suffered from inconsistencies in the way journals were ranked across disciplines. </p>
<p>While some fields regarded multidisciplinary journals highly, others did not. Some academic fields had more A and A* journals than others, while new and emerging fields seemed disadvantaged.</p>
<p>Multidisciplinary and interdisciplinary research is now recognised as the frontier for innovation around the world. However, the ERA worked against this trend and disadvantaged scholars publishing outside their immediate discipline areas. </p>
<p>The Fields of Research codes used by the ERA were often a poor reflection of researchers’ locations within universities, with many spanning disciplinary fields. </p>
<p>While the science disciplines relied on impact factors when assessing journals, the social sciences and humanities did not, due to the poor coverage of non-science journals in benchmark systems such as the Institute for Scientific Information. </p>
<p>Arguably, a more complete coverage of academic citations can be found in Google Scholar. The ERA also gave little weight to Australian-based journals and ignored research books. </p>
<p>This disadvantaged fields such as Australian history, literature and industrial relations, which led the Deans of Arts, Social Sciences and Humanities to question the merits of the ERA as far back as 2009.</p>
<p>There has been a quest for international benchmarks in universities for many years. This reflects the globalisation of the sector over the past thirty years.</p>
<p>It is also a response by governments to public pressure to see what return is being obtained from public funding of university research. As a result the sector has become obsessed with league tables, often to the detriment of appropriate activity. </p>
<p>An unfortunate outcome of the ERA was the way it was presented within the media and interpreted by the sector.</p>
<p>At an institutional level, the ERA suggested 70% of Australia’s universities were performing below world’s best practice in research. </p>
<p>This did little to assist Australia’s embattled international education sector, which is our third most important export industry, although times have been hard recently. </p>
<p>It would be interesting to see the reaction if the Government suggested 70% of Australia’s tourist destinations were not up to standard.</p>
<p>In a review of the British University Quality Assessment system in the late 1980s, one official report noted: </p>
<p>“No one has yet devised even a single indicator of performance measurement, which commands wide support amongst the academic community… and those using performance measures, whether they refer to teaching or research activities, should use them with great caution and considerable humility”. </p>
<p>Little has changed since. Britain’s <a href="http://www.qualityresearchinternational.com/glossary/rae.htm">Research Assessment Exercise</a> remains as controversial as the ERA. </p>
<p>The modern academic must be a good researcher and teacher, but is also expected to engage with the community. </p>
<p>Like the universities in which they work, they serve multiple stakeholders. A paper published in an A* journal may reflect good scientific method or leading-edge theory, but academic impact is not only measured by peer citations, as a number of recent Australians of the Year can attest.</p>
<p>If the ERA or its successor is to have a future it must provide a more balanced measure of research performance. A “one jacket fits all” approach does not work, nor does a narrow focus on ranked journals. </p>
<p>Greater transparency and recognition of interdisciplinary and multidisciplinary work is needed. Finally, it is clear some will seek to “game the system”. </p>
<p>Consequently, it is not sensible to introduce institutional-level performance measures that do not align with individual-level performance measures, and this inconsistency needs to be addressed before the next ERA round.</p>
<p class="fine-print"><em><span>Tim Mazzarol receives funding from the ARC.</span></em></p><p class="fine-print"><em><span>Geoffrey N. Soutar receives funding from the ARC.</span></em></p>
<p class="fine-print"><em><span>Tim Mazzarol, Winthrop Professor, Entrepreneurship, Innovation, Marketing and Strategy, The University of Western Australia; Geoffrey Norman Soutar, Professor, The University of Western Australia.</span></em></p>
<hr>
<h1>How ‘publish or perish’ is ruining finance education</h1>
<figure><img src="https://images.theconversation.com/files/1830/original/typing.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">When it comes to research, it seems quantity has become much more important than quality.</span> <span class="attribution"><span class="source">Flickr/Iscan</span></span></figcaption></figure>
<p>Imagine the following conversation between a finance academic and his or her supervisor during an annual performance review:</p>
<p>Academic: So, do you think I am ready for a promotion?<br>
Supervisor: Well, I see that your teaching is very good, service is excellent, but…<br>
Academic: But?<br>
Supervisor: You haven’t got the numbers yet.<br>
Academic: Numbers?<br>
Supervisor: Yes, the required number of papers!<br>
Academic: How many?<br>
Supervisor: Oh, just a few more.</p>
<p>The supervisor is not necessarily a bad person, but is merely following the rules. Or should we say, applying the rules. </p>
<p>The rules say a certain number of academic papers is necessary for promotions. But, herein lies a dilemma – there is really no certainty about that number. </p>
<p>There are rarely any written policies at universities on how many papers are required for promotions, so the goal posts can keep moving. </p>
<p>More interestingly, if you are able to churn out papers but have an average or poor teaching record and little service to the university and community, you could become a professor very quickly. “Professor of what?”, you may ask. For many academics, it seems, it doesn’t really matter. </p>
<p>Welcome to the world of publish or perish. And be careful, the “rules” may be literally applied. After all, we’ve got to keep the pot boiling!</p>
<h2>What to publish?</h2>
<p>So, if you are a finance academic, what kind of articles should you be publishing? </p>
<p>You’ve got to write about markets, asset pricing, risk management, and so on. More importantly, you have to make some predictions. How will markets perform? How can we manage the risks of investing?</p>
<p>It is fine if your predictions, findings, propositions have not been tested or have been proved to be wrong. You can just keep publishing, and keep making more predictions and propositions. </p>
<p>The journals will publish them anyway. After all, how would they survive without a constant flow of articles?</p>
<p>The most important question here is: how does a finance academic’s research benefit society? If only the quantity of research is seen as important, then it does not matter at all. </p>
<p>Moreover, your research may not even benefit you – it will be useless if you can’t even apply your research to your own teaching.</p>
<p>So, dear finance colleagues, don’t worry about the practical value of your research.</p>
<h2>Getting the numbers</h2>
<p>Here are some practical tips on boosting your research output: </p>
<ol>
<li><p>Do not delve too much into one area. You might learn that markets are not really efficient and gain some respectability and authority in the area, but the numbers will be very slow in coming. Your colleagues will leave you behind.</p></li>
<li><p>Join your colleagues in whatever project they are working on. It does not necessarily have to be in your area or something that interests you. It’ll be worth the effort: you will get another published piece.</p></li>
<li><p>Even better, get your name associated with a project without any real input. It is OK even if you are fifth in line on the author list.</p></li>
<li><p>And, do not worry about the practical usefulness of the theories, models and of your research to society. There’s no time to be ethical.</p></li>
</ol>
<h2>Teaching by numbers</h2>
<p>Our obsession with numbers extends beyond just research output. Take the case of class attendance. On one hand, universities are increasing their online presence.</p>
<p>A colleague remarked the other day that these days you can complete a university program from your bedroom. It sounds a bit like Internet banking.</p>
<p>On the other hand, we want as many students as possible to attend lectures. Even if the lecture is on a Friday at 8pm, we’re hungry for the numbers. </p>
<p>The student failure rate is also important. You can’t have everybody passing, so you need to fail some to satisfy a pre-determined percentage.</p>
<p>Likewise, the distribution of grades is important. You can’t award too many high distinctions, or else you might get the numbers wrong.</p>
<p>And let’s not forget about meetings. What happens in a meeting or afterwards is less important than, you guessed it, how many people attended.</p>
<p>Then you can go around telling others about attendance, which is of course the most important agenda item. </p>
<p>Social functions are the same. Who turned up, who did not, is a key question. Take the annual Christmas party, for instance. No, you can’t decide not to socialise. You have to attend, even if you hate it.</p>
<p>You also have to attend the odd research seminar, but it’s ok if you don’t say a word and don’t know anything about the presentation.</p>
<p>Again, the relevance of the seminar is immaterial. What is more important is who asked the questions, how many questions were asked, who understands econometrics better, and how many attended. </p>
<p>But there are some questions you should not ask, including: what is the relevance of the research for society or the business world? What are the implications for the future?</p>
<p>If you attend the seminars, meetings and so on, you make a claim in your performance review – that you are a good corporate citizen.</p>
<p>You can say that you have attended four staff meetings, two graduation ceremonies, one Christmas party, and six research seminars. If you don’t attend these and your paper numbers are not up to scratch, then you could be in trouble. </p>
<p>The moral of the story is: show your face, talk to a few people, and make sure they notice you.</p>
<h2>Padding out the figures</h2>
<p>When you do produce a research paper in finance, make sure it has lots of equations and models, and even more tables and diagrams. </p>
<p>Don’t worry that Paul Krugman, the famous US economist, said that equations and diagrams should be no more than scaffolding used to construct an intellectual edifice; once the edifice is built, the scaffolding comes down and only plain English remains.</p>
<p>Also, the more references the better. Cite anybody who has done similar work.</p>
<p>And a tip on referencing: no need to read the literature that is to be included in your list – just get the list from another paper.</p>
<p>Just get the numbers, keep the pot boiling and publish or perish!</p>
<p class="fine-print"><em><span>Parmendra Sharma does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em><span>Parmendra Sharma, Lecturer, Department of Accounting, Finance and Economics, Griffith University.</span></em></p>
<hr>
<h1>In universities obsessed with research, here’s what falls between the cracks</h1>
<figure><img src="https://images.theconversation.com/files/695/original/Volunteer_fireys_Rob_Down_Under_Flickr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The university funding system discourages research on volunteers like these men who are risking their lives to help their community.</span> <span class="attribution"><span class="source">Flickr/Rob Down Under</span></span></figcaption></figure>
<p>In Australian universities at the moment, research is everything. Universities obsess over their rankings in the new ERA system, which measures research performance. For academics, publishing in the top journals isn’t just part of playing the game: it’s the whole game. </p>
<p>But there are things that the rankings don’t tell you, and valuable work that research-obsessed university administrators currently don’t recognise. </p>
<p>It’s another example of the measurement system changing the ways people behave, for the worse. And the unintended consequences of this unhealthy research obsession are holding us back.</p>
<h2>It’s the research, stupid</h2>
<p>The accepted way to measure academic performance has become research output. Excellence in Research for Australia (ERA), administered by the Australian Research Council (the government funding agency for research), helps distribute the resources which shape the way Australia presents its knowledge to the world. </p>
<p>Getting a slice of the $510 million Sustainable Research Excellence program has become the holy grail for many university administrators. But it ignores the hard work being done teaching the next generation.</p>
<p>There is already evidence that research assessment exercises overseas have amplified the swing to more research emphasis in promotion policies. Take this report from Alan Jenkins and Graham Gibbs in the Guardian on 15th August 1995.</p>
<p>“A survey conducted by the Oxford Centre for Staff Development in the UK showed that while 96% of institutions included excellence in teaching amongst promotion criteria, only 11% of promotion decisions were made on teaching grounds. Another 38% of institutions reported never having promoted someone primarily on the grounds of teaching excellence. Written responses included ‘Not in living memory’, and the wonderfully disdainful ‘Not at this institution’. One respondent reported: ‘There are three criteria for appointment here, research, research and research’”</p>
<p>In practice this may mean further concentrating research in a small number of institutions and perhaps the emergence of “teaching-only” departments or even universities. </p>
<p>Given the federal government’s commitment to increase significantly the participation rate of school leavers in higher education, many of these students may end up in research poor environments. </p>
<p>We risk creating a strange mixture of elitism and egalitarianism. </p>
<p>Universities will be able to use ERA performance to guide the allocation of resources as well as invest in future skills. </p>
<h2>How important is your journal? </h2>
<p>Many people spend their lives getting <em>that</em> paper into <em>that</em> journal. Name dropping matters. To be taken seriously, and enjoy the funding benefits, you have to be published in the key journals in your field. Risk taking is avoided. To get published you have to cite those who have trodden that path before.</p>
<p>But the <a href="http://www.arc.gov.au/era/journal_list_dev.htm">2010 journal rankings</a> seriously devalue various interdisciplinary research fields and could damage Australia’s strong international reputation in these fields.</p>
<p>Interdisciplinary research is often where the breakthroughs come. </p>
<p>So where does that leave those whose work is innovative? Multidisciplinary researchers are thinking outside their academic box, and they’re being penalised for it. </p>
<p>In areas like the humanities, arts and social sciences, it’s tricky to assess the quality of research in the way that ERA requires. Its way of examining citation lists is not well attuned to measuring interdisciplinary research and cross-sector collaboration.</p>
<p>And that’s not all. International journals are favoured over local ones but they are not necessarily interested in publishing Australia-specific research. Thus, under the ERA, we applaud someone for publishing in an international journal, rather than recognise efforts to contribute knowledge and participate in debates in Australia. </p>
<p>A case in point is the potential effects on research on the not-for-profit sector, where journals with lower impact measures, circulation and much higher acceptance rates are outranking the most longstanding and prestigious international journals in the field.</p>
<h2>Writing off the not-for-profit sector</h2>
<figure class="align-left ">
<img alt="" src="https://images.theconversation.com/files/726/original/School_sport.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/726/original/School_sport.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=424&fit=crop&dpr=1 600w, https://images.theconversation.com/files/726/original/School_sport.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=424&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/726/original/School_sport.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=424&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/726/original/School_sport.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=533&fit=crop&dpr=1 754w, https://images.theconversation.com/files/726/original/School_sport.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=533&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/726/original/School_sport.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=533&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">86% of Australian adults work for, volunteer with or are a member of a not-for-profit organisation. Yet the role of these organisations is in danger of being ignored by the academic community.</span>
<span class="attribution"><span class="source">AAP/Julian Smith</span></span>
</figcaption>
</figure>
<p>The limitations of the ERA in measuring the quality of multidisciplinary research are glaringly obvious in not-for-profit studies. </p>
<p>It’s a rapidly growing area of research and is at the centre of many contemporary public policy debates.</p>
<p>According to the <a href="http://www.abs.gov.au/Ausstats/abs@.nsf/0/C068946BDCA09FAFCA25749B0017A3D4?OpenDocument">2006/07 ABS data</a>, the Australian not-for-profit sector: </p>
<ul>
<li><p>comprised 41,000 economically significant not-for-profit organisations;</p></li>
<li><p>employed 890,000 people, or 8.6% of Australians in employment;</p></li>
<li><p>had an income of $76 billion;</p></li>
<li><p>contributed $34 billion, or 3.4%, to GDP;</p></li>
<li><p>made an economic contribution equivalent to that of the government administration and defence industry, and one and a half times that of the agriculture industry;</p></li>
<li><p>had over 13 million Australians (86% of adults) belonging to at least one not-for-profit association, with 48% belonging to at least three;</p></li>
<li><p>had just under 1 million Australians holding office in a not-for-profit organisation.</p></li>
</ul>
<p>But if no one in Australia is researching the not-for-profit sector or volunteering, it is highly likely that there will be no undergraduate, or even postgraduate, courses on these subjects. This potentially underserves 800,000 paid employees and over 6 million volunteers – Australia’s largest workforce!</p>
<h2>The peer review process - an imperfect science</h2>
<p>The panels reviewing journals could have been more transparent about how they actually operated and what was done with their recommendations.</p>
<p>In the case of not-for-profit research, all submissions were made through the Australian Business Deans Council (ABDC) joint submission. They ranked the not-for-profit journals appropriately on their own list but they were all then downgraded in the final ERA list. </p>
<p>It is difficult to know what happened. Maybe the ABDC’s advice was rejected, or maybe they didn’t defend not-for-profit research strongly enough. I hope the current round of consultation will provide greater access to those groups that might better represent those engaged in multidisciplinary research.</p>
<h2>Why fields of research codes don’t tell the whole story</h2>
<p>If your work falls neatly into a Field of Research (FoR) code, you’re in luck: ERA can easily identify what you’re doing. But if you collaborate outside the strict parameters of a FoR and end up spread across various four-digit research fields, a strong ERA score is harder to achieve. This strengthens disciplinary “silos” while multidisciplinary approaches become invisible.</p>
<p>This was the case for not-for-profit studies, which has no distinct FoR code; internationally esteemed multidisciplinary not-for-profit journals were outranked by less prestigious, subject-specific and far less cited marketing, management and accounting journals.</p>
<p>A way forward would be to allocate not-for-profit research a FoR code, in much the same way as the ARC has given other multidisciplinary areas a code, such as tourism studies (which has its own category but only employs half the number of workers of the not-for-profit sector). </p>
<p>Other countries have Fields of Research or equivalent that separately identify not-for-profit and voluntary sector studies, so there is a strong case for international compatibility.</p>
<h2>What will happen if we don’t take action? </h2>
<p>Teaching and research are interdependent. Research productivity significantly adds to both the quality and substance of classroom teaching and teaching adds to the quality of research, not least because it allows for the (often valuable) input of students. </p>
<p>The ERA ratings say nothing about teaching excellence. For this reason it is likely that the ERA will further intensify the research culture in many university departments, probably at the expense of teaching. </p>
<p>The division created within departments between researchers and teachers can leave them unable to function as communities of scholars. Instead they become a setting for game playing for some and a home of resentment and bitterness for others.</p>
<p>By devaluing the top journals in multidisciplinary fields of research, we are on a path that leads us away from accepted international best practice – just when we need more than ever to ensure that our researchers have international standing. </p>
<p>It is time for not-for-profit sector researchers to call on the ARC to revise the 2010 journal ranking list to recognise this important field of research.</p>
<p class="fine-print"><em><span>Bronwen Dalton has received funding from the ARC.</span></em></p>
<p class="fine-print"><em><span>Bronwen Dalton, Associate Professor, Management Discipline Group, UTS Business School, University of Technology Sydney.</span></em></p>