Scientific publishing – The Conversation (2024-03-27)
English dominates scientific research – here’s how we can fix it, and why it matters<figure><img src="https://images.theconversation.com/files/582900/original/file-20240308-30-6nsuxr.jpg?ixlib=rb-1.1.0&rect=10%2C10%2C6718%2C5049&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption"></span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/es/image-illustration/select-language-learning-translate-languages-audio-525271720">Shutterstock / Maxx-Studio</a></span></figcaption></figure><p>It is often remarked that Spanish should be more widely spoken or understood in the scientific community given its number of speakers around the world, a figure the Instituto Cervantes places at <a href="https://cvc.cervantes.es/lengua/anuario/anuario_23/">almost 600 million</a>. </p>
<p>However, millions of speakers do not necessarily grant a language strength in academia. This has to be cultivated on a scientific, political and cultural level, with sustained efforts from many institutions and specialists.</p>
<h2>The scientific community should communicate in as many languages as possible</h2>
<p>By some estimates, as much as <a href="https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0238372">98% of the world’s scientific research is published in English</a>, while only around <a href="https://www.ethnologue.com/insights/ethnologue200/">18% of the world’s population</a> speaks it. This makes it essential to publish in other languages if we are to bring scientific research to society at large.</p>
<p>The value of multilingualism in science has been highlighted by numerous high-profile organisations, with public declarations and statements on the matter from the <a href="https://euraxess.ec.europa.eu/sites/default/files/am509774cee_en_e4.pdf">European Charter for Researchers</a>, the <a href="https://www.helsinki-initiative.org/en">Helsinki Initiative on Multilingualism</a>, the <a href="https://unesdoc.unesco.org/ark:/48223/pf0000379949">Unesco Recommendation on Open Science</a>, the <a href="https://operas-eu.org/special-interest-group-living-book/operas-multilingualism-white-paper-june-2021/">OPERAS Multilingualism White Paper</a>, the <a href="https://www.clacso.org/declaracion-de-principios-del-foro-latinoamericano-de-evaluacion-cientifica-folec/">Latin American Forum on Research Assessment</a>, the <a href="https://coara.eu/agreement/the-agreement-full-text/">COARA Agreement on Reforming Research Assessment</a>, and the <a href="https://digital.csic.es/bitstream/10261/284851/1/Conclusiones%20y%20recomendaciones%20del%20grupo%20de%20trabajo%20sobre%20publicaciones%20cienti%20ficas-%20VF%20ES-1.pdf">Declaration of the 5th Meeting of Ministers and Scientific Authorities of Ibero-American Countries</a>. These organisations all agree on one thing: all languages have value in scientific communication.</p>
<p>As <a href="https://digital.csic.es/bitstream/10261/284851/1/Conclusiones%20y%20recomendaciones%20del%20grupo%20de%20trabajo%20sobre%20publicaciones%20cienti%20ficas-%20VF%20ES-1.pdf">the last of these declarations</a> points out, locally, regionally and nationally relevant research is constantly being published in languages other than English. This research has an economic, social and cultural impact on its surrounding environment: when scientific knowledge is disseminated, it filters through to non-academic professionals, creating a broader culture of knowledge sharing. </p>
<p>Greater diversity also enables fluid dialogue among academics who share the same language, or who speak and understand multiple languages. In Ibero-America, for example, Spanish and Portuguese can often be <a href="https://www.jstor.org/stable/343562">mutually understood</a> by non-native speakers, allowing them to share the scientific stage. The same happens in Spain with the majority of its <a href="https://en.wikipedia.org/wiki/Official_languages_of_Spain">co-official languages</a>. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/non-native-english-speaking-scientists-work-much-harder-just-to-keep-up-global-research-reveals-208750">Non-native English speaking scientists work much harder just to keep up, global research reveals</a>
</strong>
</em>
</p>
<hr>
<h2>No hierarchies, no categories</h2>
<p>Too often, scientific research in any language other than English is automatically seen as second tier, with little consideration for the quality of the work itself. </p>
<p>This harmful prejudice ignores the work of those involved, especially in the humanities and social sciences. It also profoundly undermines the global academic community’s ability to share knowledge with society.</p>
<p>By defending and preserving multilingualism, the scientific community brings research closer to those who need it. Failing to pursue this aim means that academia cannot develop or expand its audience. We have to work carefully, systematically and consistently in every language available to us.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/prestigious-journals-make-it-hard-for-scientists-who-dont-speak-english-to-get-published-and-we-all-lose-out-226225">Prestigious journals make it hard for scientists who don't speak English to get published. And we all lose out</a>
</strong>
</em>
</p>
<hr>
<h2>The logistics of strengthening linguistic diversity in science</h2>
<p>Making a language stronger in academia is a complex process. It does not happen spontaneously, and requires careful coordination and planning. Efforts have to come from public and private institutions, the media, and other cultural outlets, as well as from politicians, <a href="https://www.eeas.europa.eu/eeas/what-science-diplomacy_en">science diplomacy</a>, and researchers themselves. </p>
<p>Many of these elements have to work in harmony, as demonstrated by the Spanish National Research Council’s work in <a href="https://pti-esciencia.csic.es/">ES CIENCIA</a>, a project which seeks to unite scientific and political efforts.</p>
<h2>Academic publishing and AI models: a new challenge</h2>
<p>The global academic environment is changing as a result of the digital transition and new models of open access. Research into publishers of scientific content in other languages will be essential to understanding this shift. One thing is clear, though: making scientific content produced in a particular language visible and searchable online is crucial to ensuring its strength.</p>
<p>In the case of academic books, <a href="https://elpais.com/opinion/2023-06-16/abrir-los-libros.html">the transition to open access has barely begun</a>, especially in the commercial publishing sector, which releases around 80% of scientific books in Spain. As with online publishing, a clear understanding of this transition will make it possible to design policies and models that account for the different ways of disseminating scientific research, including those that communicate locally and in languages other than English. Greater linguistic diversity in book publishing can also allow us to properly recognise the work done by publishers in sharing research among non-English speakers.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/removing-author-fees-can-help-open-access-journals-make-research-available-to-everyone-189675">Removing author fees can help open access journals make research available to everyone</a>
</strong>
</em>
</p>
<hr>
<p>Making publications, datasets, and other non-linguistic research results easy to find is another vital element, which requires both scientific and technical support. The same applies to expanding the corpus of scientific literature in Spanish and other languages, especially since this feeds into generative artificial intelligence models.</p>
<p>If linguistically diverse scientific content is not incorporated into AI systems, they will spread information that is incomplete, biased or misleading: a recent Spanish government <a href="https://portal.mineco.gob.es/es-es/comunicacion/Paginas/Informe_Corpus_publicaciones.aspx">report on the state of Spanish and co-official languages</a> points out that 90% of the text currently fed into AI is written in English.</p>
<h2>Deep study of terminology is essential</h2>
<p>Research into terminology is of the utmost importance in preventing the use of improvised, imprecise language or unintelligible jargon. It can also bring huge benefits for the quality of both human and machine translations, specialised language teaching, and the indexing and organisation of large volumes of documents. </p>
<p>Terminology work in Spanish is being carried out today thanks to the processing of large language corpora by AI and by researchers in the <a href="https://www.csic.es/es/actualidad-del-csic/el-proyecto-teresia-recuperara-y-fomentara-la-terminologia-en-espanol-aplicando">TeresIA</a> project, a joint effort coordinated by the Spanish National Research Council. However, it took 15 years of ups and downs to get such a project off the ground in Spanish. </p>
<p>The Basque Country, Catalonia and Galicia, on the other hand, have worked intensively and systematically on their respective languages. They have not only tackled terminology as a public language policy issue, but have also been committed to established terminology projects for a long time.</p>
<h2>Multilingualism is a global issue</h2>
<p>This need for broader diversity also applies to Ibero-America as a whole, where efforts are being coordinated to promote Spanish and Portuguese in academia, notably by the <a href="https://www.segib.org/en/">Ibero-American General Secretariat</a> and the <a href="https://conahcyt.mx/">Mexican National Council of Humanities, Sciences and Technologies</a>. </p>
<p>While this is sorely needed, we cannot promote the region’s two most widely spoken languages and also ignore its diversity of indigenous and co-official languages. These are also involved in the production of knowledge, and are a vehicle for the transfer of scientific information, as demonstrated by efforts in Spain.</p>
<p>Each country has its own unique role to play in promoting greater linguistic diversity in scientific communication. If this can be achieved, the strength of Iberian languages – and all languages, for that matter – in academia will not be at the mercy of well-intentioned but sporadic efforts. It will, instead, be the result of the scientific community’s commitment to a culture of knowledge sharing.</p>
<p class="fine-print"><em><span>Elea Giménez Toledo does not receive a salary from, consult for, own shares in or receive funding from any company or organisation that would benefit from this article, and has declared no relevant affiliations beyond the academic appointment cited.</span></em></p>Around 98% of all research is published in English, posing a serious problem for the global scientific community.Elea Giménez Toledo, Tenured Scientist, Centro de Ciencias Humanas y Sociales (CCHS - CSIC)Licensed as Creative Commons – attribution, no derivatives.
Prestigious journals make it hard for scientists who don’t speak English to get published. And we all lose out (2024-03-21)<figure><img src="https://images.theconversation.com/files/583287/original/file-20240320-17-ek0zj5.jpeg?ixlib=rb-1.1.0&rect=0%2C12%2C4288%2C2830&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/stem-cotton-gossypium-hirsutum-microscopic-view-170232521">D. Kucharski K. Kucharska/Shutterstock</a></span></figcaption></figure><p>For the first time in history, a single language dominates global scientific communication. But the actual production of knowledge continues to be a multilingual enterprise.</p>
<p>The use of English as the norm poses challenges for scholars from regions where English is not widely spoken. They must decide whether to publish in English for global visibility, or publish in their native language to make their work accessible to local communities. And when they work in English, they end up <a href="https://theconversation.com/non-native-english-speaking-scientists-work-much-harder-just-to-keep-up-global-research-reveals-208750">expending more time and effort</a> writing and revising papers than their native English-speaking peers.</p>
<p>As gatekeepers of scientific knowledge, academic publishers play a key role in helping or hindering the participation of a multilingual scientific community. So how are they doing?</p>
<p>We reviewed the policies of 736 journals in the biological sciences and discovered the great majority are making only minimal efforts to overcome language barriers in academic publishing. Our research is <a href="https://royalsocietypublishing.org/doi/10.1098/rspb.2023.2840">published in Proceedings of the Royal Society B</a>.</p>
<h2>A wide range of inclusive policies</h2>
<p>Linguistically inclusive policies come in many forms, and can be implemented at each stage of the editorial process. They might aim to make publishing more multilingual. Alternatively – if sticking with English – they may aim to reduce the burden on non-native English speakers.</p>
<p>Allowing papers to be published in more than one language at the same time would resolve the dilemma many non-native English speaking scholars face about communicating locally or globally. However, only 7% of the journals we surveyed allowed this possibility. (A further 11% will allow multilingual versions of an abstract alone.)</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/non-native-english-speaking-scientists-work-much-harder-just-to-keep-up-global-research-reveals-208750">Non-native English speaking scientists work much harder just to keep up, global research reveals</a>
</strong>
</em>
</p>
<hr>
<p>Another possibility would be to implement machine translation tools to make versions of an article available in multiple languages on a journal’s website. There has been recent <a href="https://academic.oup.com/bioscience/article/72/10/988/6653151">progress in this area</a>, but only 11% of journals we surveyed have put it into practice. </p>
<p>Journals can also indicate they value submissions from authors from diverse linguistic backgrounds by explicitly declaring they will not reject manuscripts solely on the basis of the perceived quality of the English. Surprisingly, we found only two journals stated this.</p>
<p>Similarly, providing author guidelines in multiple languages would further encourage submissions from diverse authors. While 11% of the journals we examined translate specific sections of their guidelines to other languages, only 8% offer their entire guidelines in more than one language.</p>
<p>To ensure published research learns from the scientific contributions of <a href="https://besjournals.onlinelibrary.wiley.com/doi/10.1111/1365-2664.14370">scholars from around the globe</a>, journals should explicitly allow or encourage non-English literature to be cited. Only one tenth of journals mention this in author guidelines.</p>
<p>Journals may also adopt measures to ensure work submitted by non-native English speakers is assessed fairly. One such measure is the provision of English-language editing services. </p>
<p>More than half the journals we surveyed refer authors to some kind of editing service; only 1% offer the service free of charge to authors. The cost of editing may impose a considerable <a href="https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0238372">financial burden</a> on scholars in lower-income countries.</p>
<p>Another measure is to educate reviewers and editors about language barriers and instruct them to assess the manuscripts based on their research attributes alone. This is something only 4–6% of journals implement.</p>
<h2>Drivers of inclusivity</h2>
<p>We also identify two key influences on a journal’s adoption of linguistically inclusive policy. </p>
<p>The first is impact factor, a measure commonly taken to represent the prestige of a journal. We found journals with higher impact factors tend to adopt less-inclusive policies, possibly because they mostly target English-proficient authors and readers.</p>
<p>The second influence is ownership by a scientific society. Journals owned by scientific societies tended to adopt more inclusive policies. They have also taken the lead in the movement to publish multilingual content.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/what-is-open-access-and-why-should-we-care-11608">What is open access and why should we care?</a>
</strong>
</em>
</p>
<hr>
<p>Many scientific societies have a mandate to <a href="https://anatomypubs.onlinelibrary.wiley.com/doi/10.1002/ar.24735">foster diverse communities</a>. They are supported by their members and are well positioned to push for a cultural change in scientific publishing.</p>
<p>We also found that open access journals (which make research available to the public for free) were no more likely to adopt inclusive linguistic policies, nor were journals with more diverse editorial boards. </p>
<p>The apparent lack of influence of linguistically diverse board members is a puzzle. Perhaps editors who have experienced language barriers in their own professional life do not advocate for non-native English speaking authors. Or perhaps editorial boards have less power to define editorial policies than we might expect.</p>
<h2>Language barriers</h2>
<p>Language barriers deepen geographic divides, hampering knowledge sharing. Tackling them in academic publishing becomes critical to effectively address both regional and global issues, such as health and conservation.</p>
<p>In our study, we looked at a number of linguistically inclusive policies, but there are plenty of other things journals can do to help scientists from non-English speaking backgrounds. These range from <a href="https://www.science.org/doi/10.1126/science.adg9714">using artificial intelligence tools</a> to the re-negotiation of copyrights to <a href="https://academic.oup.com/iob/article/5/1/obad003/7008844">authorise the publication of translations</a> elsewhere.</p>
<p class="fine-print"><em><span>Henry Arenas-Castro does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>A study of 736 biological science journals showed only a small fraction are making efforts to foster a multilingual scientific community.Henry Arenas-Castro, Postdoctoral Fellow, Yale University, The University of QueenslandLicensed as Creative Commons – attribution, no derivatives.
New ‘ethics guidance’ for top science journals aims to root out harmful research – but can it succeed? (2022-10-09)<figure><img src="https://images.theconversation.com/files/487469/original/file-20220930-24-hj7oj4.jpeg?ixlib=rb-1.1.0&rect=6%2C31%2C4191%2C2599&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://unsplash.com/photos/RlOAwXt2fEA">Julia Koblitz / Unsplash</a></span></figcaption></figure><p>The British journal Nature was founded in 1869 and is one of the world’s most influential and prestigious outlets for scientific research. Its publisher, Nature Portfolio (a subsidiary of the academic publishing giant Springer Nature), also publishes <a href="https://www.nature.com/siteindex">dozens of specialised journals</a> under the Nature banner, covering almost every branch of science.</p>
<p>In August, the company published <a href="https://www.nature.com/nature-portfolio/editorial-policies/ethics-and-biosecurity">new ethics guidance</a> for researchers. The new guidance is <a href="https://www.nature.com/articles/d41586-022-03035-6">part</a> of Nature’s “attempt to acknowledge and learn from our troubled deep and recent past, understand the roots of injustice and work to address them as we aim to make the scientific enterprise open and welcoming to all”.</p>
<p>An accompanying <a href="https://www.nature.com/articles/s41562-022-01443-2">editorial</a> argues the ethical responsibility of researchers should include people and groups “who do not participate in research but may be harmed by its publication”. </p>
<p>It also notes that for some research, “potential harms to the populations studied may outweigh the benefit of publication”, and licenses editors to make such determinations. Editors may modify, amend or “correct” articles post-publication. They may also decline to publish, or retract, objectionable content or articles, such as “[s]exist, misogynistic and/or anti-LGBTQ+ content”.</p>
<p>The guidance is correct to say academic freedom, like other freedoms, is not absolute. It’s also legitimate to suggest science can indirectly harm social groups, and their rights may sometimes trump academic freedom. Despite this, some aspects of the new guidance are concerning.</p>
<h2>When science goes wrong</h2>
<p>There’s no doubt science can cause harm, both for its subjects and other groups. Consider an example from the late 19th century. </p>
<p>Harvard professor Edward Clarke proposed that taking part in higher education would cause fertility problems in women, because energy would be diverted from the reproductive system to the brain. </p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/487432/original/file-20220930-18-y7spf6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/487432/original/file-20220930-18-y7spf6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=1037&fit=crop&dpr=1 600w, https://images.theconversation.com/files/487432/original/file-20220930-18-y7spf6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=1037&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/487432/original/file-20220930-18-y7spf6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=1037&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/487432/original/file-20220930-18-y7spf6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1303&fit=crop&dpr=1 754w, https://images.theconversation.com/files/487432/original/file-20220930-18-y7spf6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1303&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/487432/original/file-20220930-18-y7spf6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1303&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Edward Clarke’s Sex in Education; or, a Fair Chance for Girls, argued that girls were physically unsuited to education.</span>
<span class="attribution"><a class="source" href="https://en.wikipedia.org/wiki/Edward_Hammond_Clarke#/media/File:Sex_in_Education_-_or_a_Fair_Chance_for_the_Girl.jpg">Wikimedia</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>Clarke’s account, set out in a bestselling book, has been credited with deepening public opposition to universities opening their doors to women. </p>
<p>At first glance, this seems like exactly the kind of objectionable content that Nature’s new guidance says it would seek to amend or retract. </p>
<p>But the problem with Clarke’s account was not the offensive conclusions it drew about women’s capacity for intellectual development, or the discriminatory policies to which it gave support. </p>
<p>After all, suppose he had been right? If attending university really would harm women’s reproductive health, surely they would want to know.</p>
<p>The real problem with Clarke’s work was that it was bad science. Indeed, historian of science Naomi Oreskes <a href="https://books.google.com.au/books?id=zRMCEAAAQBAJ&lpg=PA79&ots=6oFcOphIWy&dq=Feminists%20in%20the%20late%20nineteenth%20century%20found%20Clarke%E2%80%99s%20agenda%20transparent%20and%20his%20non-empirical%20methodology%20ripe%20for%20attack.&pg=PA79#v=onepage&q=Feminists%20in%20the%20late%20nineteenth%20century%20found%20Clarke%E2%80%99s%20agenda%20transparent%20and%20his%20non-empirical%20methodology%20ripe%20for%20attack.&f=false">has noted</a>: </p>
<blockquote>
<p>Feminists in the late nineteenth century found Clarke’s agenda transparent and his non-empirical methodology ripe for attack.</p>
</blockquote>
<p>So drawing a particular kind of conclusion about women and girls isn’t what makes for sexist content in science. Nor is it favouring one side or another on gender-related policies. So what is it?</p>
<p>One answer is that it is science in which gendered assumptions bias scientists’ decisions. In the <a href="https://www.jstor.org/stable/40985708">words</a> of historian and philosopher of science Sarah Richardson, this is science in which: </p>
<blockquote>
<p>gendered practices or assumptions in a scientific field prevented researchers from accurately interpreting data, caused inferential leaps, blocked the consideration of alternative hypotheses, overdetermined theory choice, or biased descriptive language.</p>
</blockquote>
<h2>Language and labels</h2>
<p>The guidance also stipulates scientists should “use inclusive, respectful, non-stigmatizing language”. This merits pause for thought. </p>
<p>Scientists should certainly be thoughtful about language, and avoid causing unnecessary offence, hurt or stigma. However, the language must also be scientifically useful and meaningful.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/whats-at-risk-if-scientists-dont-think-strategically-before-talking-politics-63797">What's at risk if scientists don't think strategically before talking politics</a>
</strong>
</em>
</p>
<hr>
<p>For example, it is the nature of categories that some entities or individuals are excluded from them. This should be based on scientific criteria, not political ones. </p>
<p>Or consider the following, offered as part of working definitions in the guidance: </p>
<blockquote>
<p>There is a broad range of gender identities including, but not limited to, transgender, gender-queer, gender-fluid, non-binary, gender-variant, genderless, agender, nongender, bi-gender, trans man, trans woman, trans masculine, trans feminine and cisgender.</p>
</blockquote>
<p>People should of course be able to identify with whatever gender label they prefer. However, “gender identity” is a vague and contested concept, and these labels (and their meanings) are subjectively defined and continue to change rapidly over time. </p>
<p>Labels that are personally meaningful, deeply felt or – as in some cases – part of a political project to dismantle gender binaries, may not necessarily be scientifically useful. </p>
<h2>An invitation to politicking</h2>
<p>By casting a wide range of content as potentially subject to editorial intervention or veto on the grounds of harm, the guidance opens the door to the politicisation of science. Other material caught in that net is:</p>
<blockquote>
<p>content that undermines – or could reasonably be perceived to undermine – the rights and dignities of an individual or human group on the basis of socially constructed or socially relevant human groupings.</p>
</blockquote>
<p>But scientists often do research providing information used to make policies, which will include the bestowing of various rights. The findings of such research can therefore sometimes be unpalatable to groups with economic, political, religious, emotional or other vested interests. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/getting-a-scientific-message-across-means-taking-human-nature-into-account-70634">Getting a scientific message across means taking human nature into account</a>
</strong>
</em>
</p>
<hr>
<p>The guidance opens the door for such groups to try to have findings contrary to those interests “corrected” or retracted. There is not much that can’t be framed as a right, a harm, or an infringement of dignity – all notoriously difficult concepts to define and reach consensus on.</p>
<p>What will determine who is successful in their attempt to have articles amended or retracted? Potential harms will be assessed by journal editors and reviewers – and they will perceive these through the lens of their own prior assumptions, ideologies and value systems. </p>
<p>Editors may also face pressure to avoid tarnishing their journal brand, either in response to, or in anticipation of, social media mobs. After all, Springer Nature ultimately answers to its shareholders.</p>
<h2>The responsibility of editors</h2>
<p>As we know from the work of feminist and other critical scholars, scientific claims based on biased research have harmed marginalised groups in many ways: by explaining away group inequalities in status, power and resources; pathologising; stigmatising; and justifying denial of rights.</p>
<p>There is no contradiction between acknowledging these harms, and also having concerns about the new Nature guidance. </p>
<p>Science journals have an important role to play in facilitating socially responsible science in these sensitive areas.</p>
<p>Journal editors should certainly do all they can to discover and scrutinise hidden biases embedded in research, such as by commissioning reviews from experts with different or critical perspectives. However, they should not second-guess what scientific claims will cause social harm, then exercise a veto.</p>
<p class="fine-print"><em><span>Cordelia Fine does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Nature’s recent efforts to redefine the ethical responsibilities of scientists leave a lot to be desired.Cordelia Fine, Professor, History & Philosophy of Science program, School of Historical & Philosophical Studies, The University of MelbourneLicensed as Creative Commons – attribution, no derivatives.
The US has ruled all taxpayer-funded research must be free to read. What’s the benefit of open access? (2022-08-29)<figure><img src="https://images.theconversation.com/files/481466/original/file-20220829-50806-wh9yvf.jpg?ixlib=rb-1.1.0&rect=1276%2C815%2C3491%2C2933&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://unsplash.com/photos/6ywyo2qtaZ8">Eugenio Mazzone/Unsplash</a></span></figcaption></figure><p>Last week, the United States announced an <a href="https://www.whitehouse.gov/ostp/news-updates/2022/08/25/breakthroughs-for-alldelivering-equitable-access-to-americas-research/">updated policy guidance</a> on open access that will substantially expand public access to science not just in America, but worldwide.</p>
<p>As per the guidance, all US federal agencies must put in place policies and plans so anyone anywhere can immediately and freely access the peer-reviewed publications and data arising from research they fund.</p>
<p>The policies need to be in place by the end of 2025, according to President Biden’s White House Office of Science and Technology Policy (OSTP).</p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1563247442065141761&quot;}"></div></p>
<h2>A substantial step</h2>
<p>The new guidance builds <a href="https://obamawhitehouse.archives.gov/blog/2013/02/22/expanding-public-access-results-federally-funded-research">on a previous memo</a> issued by then president Barack Obama’s office in 2013. That one only applied to the largest funding agencies and, in a crucial difference, allowed for a 12-month delay or embargo for the publications to be available.</p>
<p>Now we’re seeing a substantial step forward in a lengthy effort – extending back to the <a href="https://www.budapestopenaccessinitiative.org/">beginning of this century</a> – to open up access to the world’s research.</p>
<p>We can expect it to act as a catalyst for more policy changes globally. It’s also especially timely given UNESCO’s <a href="https://www.unesco.org/en/natural-sciences/open-science">Open Science Recommendation</a> adopted in 2021. The new OSTP guidance emphasises that the primary intention is for the US public to have immediate access to research funded by their tax dollars.</p>
<p>But thanks to the conditions for opening up said research, people worldwide will benefit.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/busting-the-top-five-myths-about-open-access-publishing-14792">Busting the top five myths about open access publishing</a>
</strong>
</em>
</p>
<hr>
<h2>A discriminatory system</h2>
<p>It might seem obvious that with our ubiquitous internet access, there should already be immediate open access to publicly funded research. But that isn’t the case for most published studies.</p>
<p>Changing the system has been challenging, not least because academic publishing is dominated by a small number of <a href="https://dx.plos.org/10.1371/journal.pone.0127502">highly profitable and powerful publishers</a>.</p>
<p>Open access matters for both the public and academics, as the fast-moving emergency of the COVID-19 pandemic amply demonstrated.</p>
<p>Even academics at well-funded universities can mostly only access journals their universities subscribe to – and no institution can afford to subscribe to everything published. By some estimates, around 2 million research articles were published last year alone. People outside a university – in a small company, a college, a GP practice, a newsroom, or citizen scientists – have to pay for access.</p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1390798132632596488&quot;}"></div></p>
<p>As the new guidance notes, this lack of public access leads to “discrimination and structural inequalities… [that] prevent some communities from reaping the rewards of the scientific and technological advancements”. Furthermore, lack of access leads to mistrust in research.</p>
<p>The accompanying <a href="https://www.whitehouse.gov/wp-content/uploads/2022/08/08-2022-OSTP-Public-Access-Memo.pdf">OSTP memo</a> highlights that future policies should support scientific and research integrity, with the aim of increasing public trust in science.</p>
<p>COVID-19 is not the first rapid global emergency, and it won’t be the last. For example, doctors’ inability <a href="https://www.nytimes.com/2015/04/08/opinion/yes-we-were-warned-about-ebola.html">to access research on Ebola</a> may have directly contributed to the 2014–16 outbreak in West Africa.</p>
<p>In the early stages of the COVID-19 pandemic, the <a href="https://trumpwhitehouse.archives.gov/wp-content/uploads/2020/03/COVID19-Open-Access-Letter-from-CSAs.Equivalents-Final.pdf">White House led calls</a> for publishers to make COVID-19 publications open to all. Most (but not all) did, and that call led to one of the biggest databases of openly available papers ever assembled – the <a href="https://allenai.org/data/cord-19">CORD-19 database</a>.</p>
<p>But not all of those COVID-19 papers will be permanently openly available, since some publishers put conditions on their accessibility. With the current spread of monkeypox, we are potentially facing another global emergency. In August this year, the White House once again <a href="https://www.whitehouse.gov/ostp/news-updates/2022/08/04/a-call-for-public-access-to-monkeypox-related-research-and-data/">called for publishers</a> to make relevant research open.</p>
<p>The OSTP guidance will finally mean that, at least for US federally funded research, the time of governments having to repeatedly call for publishers to make research open is over.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/not-just-available-but-also-useful-we-must-keep-pushing-to-improve-open-access-to-research-86058">Not just available, but also useful: we must keep pushing to improve open access to research</a>
</strong>
</em>
</p>
<hr>
<h2>The situation in Australia</h2>
<p>In Australia, we don’t yet have a national approach to open access. The two national research funders, the <a href="https://www.nhmrc.gov.au/about-us/resources/open-access-policy">NHMRC</a> and <a href="https://www.arc.gov.au/about-arc/program-policies/open-access-policy">ARC</a>, have policies in place similar to the 2013 US guidance, allowing a 12-month embargo period. The NHMRC consulted last year on an immediate open access policy.</p>
<p>All Australian universities provide access to their research through their repositories, although that access varies depending on individual universities’ and publishers’ policies. Most recently, <a href="https://caul.libguides.com/read-and-publish">the Council of Australian University Librarians negotiated</a> a number of consortial open access deals with publishers. Cathy Foley, Australia’s Chief Scientist, is also considering a <a href="https://www.chiefscientist.gov.au/Dr-Cathy-Foley-delivers-National-Press-Club-Address">national model for open access</a>.</p>
<p>So what’s next? As expected, perhaps, some of the larger publishers are already <a href="https://www.nytimes.com/2022/08/25/us/white-house-federally-funded-research-access.html">making the case</a> for more funding for them to support this policy. It will be important that this policy doesn’t lead to a financial bonanza for these already very profitable companies – nor a consolidation of their power.</p>
<p>Rather, it would be good to see financial support for innovation in publishing, and a recognition that we need a <a href="https://www.coar-repositories.org/news-updates/fostering-bibliodiversity-in-scholarly-communications-a-call-for-action/">diversity of approaches</a> to support an academic publishing system that works for the benefit of all.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/making-australian-research-free-for-everyone-to-read-sounds-ideal-but-the-chief-scientists-open-access-plan-isnt-risk-free-171389">Making Australian research free for everyone to read sounds ideal. But the Chief Scientist's open-access plan isn't risk-free</a>
</strong>
</em>
</p>
<hr>
<img src="https://counter.theconversation.com/content/189466/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Virginia Barbour is the Director of Open Access Australasia, which advocates for Open Access in Australia and Aotearoa New Zealand.</span></em></p>Lack of free access to research leads to discrimination, both in academia and for us all. The new guidance from the US is a huge step in the right direction.Virginia Barbour, Director, Open Access Australasia, Queensland University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1733212021-12-16T13:28:19Z2021-12-16T13:28:19ZSurveys of scientists show women and young academics suffered most during pandemic and may face long-term career consequences<figure><img src="https://images.theconversation.com/files/437659/original/file-20211214-27402-1j8amls.jpg?ixlib=rb-1.1.0&rect=0%2C201%2C6448%2C4245&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Working from home comes with many distractions.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/mother-working-from-home-with-children-in-royalty-free-image/1273890998?adppopup=true">MoMoProductions/Digital Vision via Getty Images</a></span></figcaption></figure><p>On March 6, 2020, universities across the U.S. announced systematic <a href="https://www.nytimes.com/2020/03/06/us/coronavirus-college-campus-closings.html">laboratory closures, social distancing policies and travel bans</a> to cope with the growing coronavirus epidemic. These actions, while prudent and necessary, had immediate negative impacts on the academic enterprise of science in the U.S. and around the world.</p>
<p>We are a team of <a href="https://scholar.google.com/citations?user=DGHsTEgAAAAJ&hl=en&oi=sra">researchers</a> who <a href="https://spa.asu.edu/content/lesley-michalegko">study</a> the <a href="https://scholar.google.com/citations?user=CG9lGUgAAAAJ&hl=en&oi=sra">role of science</a> and technology <a href="https://scholar.google.com/citations?user=AXfiRyYAAAAJ&hl=en&oi=sra">in society</a>. We are also part of a collaborative, multi-university project, called SciOPS, that seeks to improve <a href="https://news.asu.edu/20210128-global-engagement-sciops-gives-us-look-scientists-minds">how scientists communicate with the public</a>. As the pandemic wore on, researchers began telling us about the work stoppages, data losses and other hardships they were experiencing. We felt this was important information, so we conducted two surveys to understand how the pandemic was affecting researchers.</p>
<p>The pandemic’s hardships in academia have been widespread and lasting, but our analyses revealed that <a href="https://doi.org/10.1057/s41599-021-00823-9">female and early career scientists faced more negative impacts</a> than other groups. These differences are likely aggravating already existing disparities and potentially altering career trajectories. The negative outcomes may last well beyond the end of the pandemic. </p>
<h2>A survey of researchers</h2>
<p>The SciOPS team conducted its first COVID-19 survey in May 2020, with a follow-up exactly a year later in May 2021. For each, we invited faculty working in biology, engineering and biochemistry at a random sample of 21 U.S. research universities to participate in the study, and about 300 scientists responded each time. Through a series of multiple choice and open-ended questions, the surveys asked how researchers had been affected both professionally and personally by the pandemic. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/437661/original/file-20211214-17-1mdremf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A sign in a door saying that a university building is closed indefinitely." src="https://images.theconversation.com/files/437661/original/file-20211214-17-1mdremf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/437661/original/file-20211214-17-1mdremf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=373&fit=crop&dpr=1 600w, https://images.theconversation.com/files/437661/original/file-20211214-17-1mdremf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=373&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/437661/original/file-20211214-17-1mdremf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=373&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/437661/original/file-20211214-17-1mdremf.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=468&fit=crop&dpr=1 754w, https://images.theconversation.com/files/437661/original/file-20211214-17-1mdremf.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=468&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/437661/original/file-20211214-17-1mdremf.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=468&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Closures of schools and labs forced many scientists to work from home.</span>
<span class="attribution"><a class="source" href="https://newsroom.ap.org/detail/VirusOutbreakColorado/a2a120dd5a6d45048f24e863314e6e33/photo?Query=university%20closed%20virus%20sign&mediaType=photo&sortBy=&dateRange=Anytime&totalCount=8&currentItemNo=4">AP Photo/David Zalubowski</a></span>
</figcaption>
</figure>
<h2>How the coronavirus disrupted science</h2>
<p>Our first survey found that disruptions at work and home negatively affected research activities for a vast majority of the scientists who responded. </p>
<p>On the <a href="https://www.sci-ops.org/surveys/covid-19-survey-ii-2021-impacts-on-scientific-research">research side</a>, 93% of respondents experienced university shutdowns and 88% faced lab work disruptions. Over 80% dealt with conference cancellations and travel restrictions. Some researchers also had to adapt quickly to financial pressures, and these hurdles led many scientists to delay data collection, apply for timeline extensions or end data collection early.</p>
<p>Challenges at <a href="https://www.sci-ops.org/surveys/covid-19-personal-impacts">home also affected scientists’ work</a>. Roughly 80% of respondents said they were unable to concentrate on research activities, 72% had anxiety about contracting COVID-19 and 36% had to manage unexpected child care responsibilities. </p>
<p>The May 2021 survey showed that a year later, not much had changed. Responses were nearly identical: 92% of scientists reported difficulties from university closures, 89% experienced lab work disruptions and 84% reported collaboration disruptions that interrupted their research over the past year.</p>
<p><a href="https://www.sci-ops.org/surveys/covid-19-survey-ii-2021-personal-impacts">Issues at home were nearly the same</a> as the year prior, too. The only major difference was that 11% of respondents reported coping with a family member’s illness, compared to only 3% in 2020.</p>
<p>Inevitably, these stressors all took a toll on researchers’ well-being. Nearly 60% indicated that their overall mental health and happiness had decreased because of the pandemic. This is a higher share than in a Centers for Disease Control and Prevention study, which found <a href="http://dx.doi.org/10.15585/mmwr.mm6932a1">40% of the U.S. general public were facing mental health issues</a> in June 2020. As one researcher stated, reiterating the sentiments of many others in our study: “The mental impact of lockdown affected every researcher in my lab, including me. It was far more damaging than anything else we experienced and caused huge drop-offs in productivity.” </p>
<h2>Younger researchers and female researchers faced more difficulties</h2>
<p>Some scientists felt the added stress from a lack of boundaries between home and work much more acutely than others. The unexpected rises in parental child care and virtual schooling fell most heavily on female and early career faculty. </p>
<p>In our 2020 survey, 34% of female scientists reported disruptions due to unexpected child care responsibilities, compared to 21% of males. Early career faculty struggled more too. Roughly 43% of assistant professors indicated unexpected child care duties caused major disruptions to their research, <a href="https://doi.org/10.1057/s41599-021-00823-9">30% more than their most senior colleagues</a>. In total, nearly 50% of both female respondents and assistant professors reported an inability to concentrate on research activities, while only 29% of male colleagues and 36% of senior colleagues reported the same.</p>
<p><iframe id="6vK2B" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/6vK2B/8/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>These unequal burdens barely changed between 2020 and 2021. If anything, issues got worse for female scientists. Many reported other unanticipated complications such as management of other family members’ mental health, divorce and limited space at home. </p>
<p>Given the extra burdens young researchers and female researchers are facing, it’s no surprise their work suffered. Other research has shown that during the pandemic, female scientists had <a href="https://doi.org/10.1038/s41562-020-0921-y">significantly less time to work on research</a>. Many were not able to meet deadlines, and so they <a href="https://doi.org/10.3389/fpsyg.2021.663252">submitted fewer manuscripts</a> compared to pre-pandemic levels. </p>
<p>Unsurprisingly, these impacts on productivity were <a href="https://doi.org/10.3389/fpsyg.2021.663252">even worse for women with children</a>. Research has shown that home disruptions can cascade over time and result in <a href="https://www.nature.com/articles/s41562-020-0921-y">delayed promotions and tenure</a>. Even pre-COVID-19, working mothers in academia left their respective fields at much higher rates than their male colleagues, and this trend was <a href="https://doi.org/10.1513/AnnalsATS.202006-589IP">further amplified by the pandemic</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/437662/original/file-20211214-27-11jx78v.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A screen with many faces on it participating in a video conference." src="https://images.theconversation.com/files/437662/original/file-20211214-27-11jx78v.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/437662/original/file-20211214-27-11jx78v.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/437662/original/file-20211214-27-11jx78v.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/437662/original/file-20211214-27-11jx78v.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/437662/original/file-20211214-27-11jx78v.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/437662/original/file-20211214-27-11jx78v.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/437662/original/file-20211214-27-11jx78v.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Researchers figured out ways to work around challenges posed by the pandemic.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/social-teleconference-during-covid-19-royalty-free-image/1217489268?adppopup=true">GabrielPevide/E+ via Getty Images</a></span>
</figcaption>
</figure>
<h2>Adapting to the new world</h2>
<p>Undoubtedly, the pandemic has had devastating effects on academic research and those who do it. But hidden among the gloom of our surveys were a few bright spots that highlight the resilience of the scientific community.</p>
<p>In our 2020 survey, 37% of scientists said that they developed new research topics to pursue, and 22% developed new collaborations. Virtual meetings proved to be a valuable transition for some. As one researcher noted, “Through regular videoconference discussions, new and long-distance collaborations have been initiated and maintained between four labs in the U.S. This would have been never envisaged prior to the Zoom era.”</p>
<p>The pandemic highlighted existing problems within science but also offered lessons to be learned. Many in academia want to avoid <a href="https://www.insidehighered.com/views/2021/02/10/without-intentional-interventions-pandemic-will-make-higher-education-less-diverse">deepening existing inequities in the scientific workforce</a>, and studies have <a href="https://doi.org/10.1371/journal.pbio.3001100">outlined ways to do this</a>. By implementing programs such as tenure clock extensions, advocating for affordable child care and allocating funds to support early career women researchers, the scientific community could enable broader participation, capacity and production for all scientists. </p>
<p>Looking forward, we believe it is critically important for universities and research funders to proactively address the continuing challenges posed by the pandemic, particularly for female and early career faculty. With so much in flux, there is an opportunity to change and improve a system that wasn’t working for a lot of people prior to the pandemic.</p>
<p>[<em>Get the best of The Conversation, every weekend.</em> <a href="https://memberservices.theconversation.com/newsletters/?nl=weekly&source=inline-weeklybest">Sign up for our weekly newsletter</a>.]</p><img src="https://counter.theconversation.com/content/173321/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Many scientists stuck at home during university closures dealt with increased domestic responsibilities. But some groups had it worse than others.Lesley Michalegko, Research Project Manager of Public Policy, Arizona State UniversityEric Welch, Professor & Director, Center for Science, Technology & Environmental Policy Studies, Arizona State UniversityMary K. Feeney, Professor and Lincoln Professor of Ethics in Public Affairs, Arizona State UniversityTimothy P. Johnson, Professor Emeritus of Public Administration, University of Illinois ChicagoLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1502842021-01-03T18:57:17Z2021-01-03T18:57:17Z2020 locked in shift to open access publishing, but Australia is lagging<figure><img src="https://images.theconversation.com/files/374711/original/file-20201214-20-7isptm.jpg?ixlib=rb-1.1.0&rect=0%2C385%2C5574%2C3713&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/surreal-painting-opened-door-open-book-606475526">Bruce Rolff/Shutterstock</a></span></figcaption></figure><p>For all its faults, 2020 appears to have locked in momentum for the open access movement. But it is time to ask whether providing free access to published research is enough – and whether the global goal should be equitable access not only to reading knowledge, but also to making it.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/L5rVH1KGBCY?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">An explanation of open access and how the system of having to pay for access to published research came about.</span></figcaption>
</figure>
<p>In Australia the first challenge is to overcome the apathy about open access issues. The term “open access” has been too easy to ignore. Many consider it a low priority compared to achievements in research, obtaining grant funding, or university rankings glory.</p>
<p>But if you have a child with a rare disease and want access to the latest research on that condition, you get it. If you want to see new solutions to climate change identified and implemented, you get it. If you have ever searched for information and run into a paywall requiring you to pay more than your wallet holds to read a single journal article that you might not even find useful, you will get it. And if you are watching dire international headlines and want to see a rapid solution to the pandemic, you will probably get it.</p>
<p>Many publishing houses temporarily threw open their paywall doors during the year. Suddenly, there was free access to research papers and data for scholars researching pandemic-related issues, and also for students seeking to pursue their studies online across a range of disciplines.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/science-publishing-has-opened-up-during-the-coronavirus-pandemic-it-wont-be-easy-to-keep-it-that-way-142984">Science publishing has opened up during the coronavirus pandemic. It won't be easy to keep it that way</a>
</strong>
</em>
</p>
<hr>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/374703/original/file-20201214-15-1jxoc5f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Graphic showing benefits of open access" src="https://images.theconversation.com/files/374703/original/file-20201214-15-1jxoc5f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/374703/original/file-20201214-15-1jxoc5f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=420&fit=crop&dpr=1 600w, https://images.theconversation.com/files/374703/original/file-20201214-15-1jxoc5f.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=420&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/374703/original/file-20201214-15-1jxoc5f.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=420&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/374703/original/file-20201214-15-1jxoc5f.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=528&fit=crop&dpr=1 754w, https://images.theconversation.com/files/374703/original/file-20201214-15-1jxoc5f.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=528&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/374703/original/file-20201214-15-1jxoc5f.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=528&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption"></span>
<span class="attribution"><a class="source" href="http://theblogworm.blogspot.com/2014/05/open-access-publishing-growing-area-at.html">Safia Begum/The Blogworm/Aston University</a></span>
</figcaption>
</figure>
<p>In October 2020, UNESCO <a href="https://en.unesco.org/covid19/communicationinformationresponse/opensolutions">made the case for open access to enhance research and information</a> on COVID-19. It also joined the World Health Organisation and UN High Commissioner for Human Rights in <a href="https://en.unesco.org/news/unesco-who-and-high-commissioner-human-rights-call-open-science">calling for open science</a> to be implemented at all stages of the scientific process by all member states.</p>
<p>There is clearly an appetite for freely available information. Since it was established in early 2020, the <a href="https://www.semanticscholar.org/cord19">CORD-19</a> website has built up a repository of more than 280,000 articles related to COVID-19. These have attracted tens of millions of views.</p>
<h2>Europe has led the way</h2>
<p>Europe was already ahead of the curve on open access, and 2020 has accelerated the change. Plan S <a href="https://www.coalition-s.org/">is an initiative for open access</a> launched in Europe in 2018. It requires research from all projects funded by the European Commission and the European Research Council to be published open access.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/all-publicly-funded-research-could-soon-be-free-for-you-the-taxpayer-to-read-111825">All publicly funded research could soon be free for you, the taxpayer, to read</a>
</strong>
</em>
</p>
<hr>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/374697/original/file-20201214-13-1ck6zbf.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Chart showing growth in number of open access repositories" src="https://images.theconversation.com/files/374697/original/file-20201214-13-1ck6zbf.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/374697/original/file-20201214-13-1ck6zbf.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=433&fit=crop&dpr=1 600w, https://images.theconversation.com/files/374697/original/file-20201214-13-1ck6zbf.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=433&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/374697/original/file-20201214-13-1ck6zbf.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=433&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/374697/original/file-20201214-13-1ck6zbf.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=544&fit=crop&dpr=1 754w, https://images.theconversation.com/files/374697/original/file-20201214-13-1ck6zbf.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=544&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/374697/original/file-20201214-13-1ck6zbf.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=544&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Growth in the number of open access repositories listed in the international Registry of Open Access Repositories.</span>
<span class="attribution"><a class="source" href="https://en.wikipedia.org/wiki/Open_access#/media/File:ROAR_growth.png">Thomas Shafee/Wikipedia</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>A <a href="https://www.ouvrirlascience.fr/wp-content/uploads/2019/07/Cost-Benefit-analysis-for-FAIR-research-data_KI0219023ENN.pdf">2018 report</a> commissioned by the European Commission found the cost to Europeans of not having access to FAIR (findable, accessible, interoperable and reusable) research data was €10 billion (A$16.1 billion) a year.</p>
<p>In 2019, open access publications accounted for 63% of publications in the UK, 61% in Sweden and 54% in France, compared to 43% of Australian publications.</p>
<h2>Australia is lagging behind</h2>
<p>Australia’s flagship research funding agency, the Australian Research Council, has <a href="https://www.arc.gov.au/policies-strategies/policy/arc-open-access-policy">required all research outputs to be open access</a> since 2013. But researchers can choose not to publish open access if legal or contractual obligations require otherwise. This caveat has led to a relatively low rate of open access in Australia.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/374694/original/file-20201214-23-1ngrkwg.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="chart showing numbers of publications that are open access and behind paywalls" src="https://images.theconversation.com/files/374694/original/file-20201214-23-1ngrkwg.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/374694/original/file-20201214-23-1ngrkwg.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=251&fit=crop&dpr=1 600w, https://images.theconversation.com/files/374694/original/file-20201214-23-1ngrkwg.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=251&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/374694/original/file-20201214-23-1ngrkwg.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=251&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/374694/original/file-20201214-23-1ngrkwg.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=316&fit=crop&dpr=1 754w, https://images.theconversation.com/files/374694/original/file-20201214-23-1ngrkwg.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=316&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/374694/original/file-20201214-23-1ngrkwg.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=316&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The increase in the numbers of open access publications in Australia has been gradual.</span>
<span class="attribution"><a class="source" href="http://openknowledge.community/dashboards/coki-open-access-dashboard/">Open Access Dashboard/Curtin Open Knowledge Initiative (COKI)</a></span>
</figcaption>
</figure>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/universities-spend-millions-on-accessing-results-of-publicly-funded-research-88392">Universities spend millions on accessing results of publicly funded research</a>
</strong>
</em>
</p>
<hr>
<p>The Council of Australian University Librarians (<a href="https://www.caul.edu.au/">CAUL</a>) and the Australasian Open Access Strategy Group (<a href="https://aoasg.org.au/">AOASG</a>) have long carried the torch for open access in Australia. But, without levers to drive change, they have struggled to change entrenched publishing practices of Australian academics.</p>
<p>Our Curtin Open Knowledge Initiative (COKI) project has examined open access across the world. We have analysed open access performance of individuals, individual institutions, groups of universities and nations in recent decades. The COKI <a href="http://openknowledge.community/dashboards/coki-open-access-dashboard/">Open Access Dashboard</a> offers a glimpse into a subset of this international data, providing insights into national open access performance.</p>
<p>This analysis shows a steady global shift towards open access publications.</p>
<p>For example, in November 2020, Springer Nature <a href="https://www.nature.com/articles/d41586-020-03324-y">announced</a> it would allow authors to publish open access in Nature and associated journals at a price of up to €9,500 (A$15,300) per paper from January 2021. This was a signal change for the publishing industry: one of the world’s most prestigious journals was overturning decades of closed-access tradition to throw open the doors, and committing to increasing its open access publications over time.</p>
<p>At the moment, the pricing of this model enables only a select group to publish open access. The cost of a single paper is equivalent to the value of some Australian research grants, though prices are expected to become more affordable over time.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/374699/original/file-20201214-24-65jouc.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Chart showing open access publication options" src="https://images.theconversation.com/files/374699/original/file-20201214-24-65jouc.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/374699/original/file-20201214-24-65jouc.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=313&fit=crop&dpr=1 600w, https://images.theconversation.com/files/374699/original/file-20201214-24-65jouc.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=313&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/374699/original/file-20201214-24-65jouc.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=313&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/374699/original/file-20201214-24-65jouc.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=393&fit=crop&dpr=1 754w, https://images.theconversation.com/files/374699/original/file-20201214-24-65jouc.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=393&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/374699/original/file-20201214-24-65jouc.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=393&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A quick guide to open access publishing: for researchers who wish to do this the required fee can be a significant deterrent.</span>
<span class="attribution"><a class="source" href="https://www.openaire.eu/a-quick-guide-to-open-access">OpenAire</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/increasing-open-access-publications-serves-publishers-commercial-interests-116328">Increasing open access publications serves publishers' commercial interests</a>
</strong>
</em>
</p>
<hr>
<h2>It’s not just about access to facts</h2>
<p>This international trend is a positive step for fans of freely available facts. However, we should not lose sight of other, potentially larger issues at play in open knowledge – namely, a level playing field for both access to published research and participation in research production. </p>
<p>Put another way, we need to pursue not only equity among knowledge takers but also among knowledge makers if we are to enable the world’s best thinkers to collaborate on the planet’s signature challenges.</p>
<p>All of this is good news for people who love to access information – but the bigger overall question for the higher education sector is about the conventions, traditions and trends that determine who gets to be considered for a job in a lab or a library or a lecture theatre. There is much more to be done to make our universities open for all – a future of equity in knowledge making as well as taking.</p>
<p class="fine-print"><em><span>Lucy Montgomery receives funding from the Arcadia Fund, which supports work to preserve endangered cultural heritage, protect endangered ecosystems, and promote access to knowledge. She also receives funding from the Andrew W Mellon Foundation for work relating to open access. In addition to this, Montgomery is Director of Research for COARD: a not-for-profit consultancy providing insight into the use and impact of open access books and journals.</span></em></p>In many other countries, a majority of research publications are now open access, but the system of paying for access still dominates academic publishing in Australia.Lucy Montgomery, Program Lead, Innovation in Knowledge Communication, Curtin UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1429842020-07-27T19:57:00Z2020-07-27T19:57:00ZScience publishing has opened up during the coronavirus pandemic. It won’t be easy to keep it that way<figure><img src="https://images.theconversation.com/files/349502/original/file-20200727-33-1ybt0ia.jpg?ixlib=rb-1.1.0&rect=15%2C7%2C5097%2C2866&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Scientific publishing is not known for moving rapidly. In normal times, publishing new research can take months, if not years. Researchers prepare a first version of a paper on new findings and submit it to a journal, where it is often rejected, before being resubmitted to another journal, peer-reviewed, revised and, eventually, hopefully published. </p>
<p>All scientists are familiar with the process, but few love it or the time it takes. And even after all this effort – for which neither the authors, the peer reviewers, nor most journal editors, are paid – most research papers end up locked away behind expensive journal paywalls. They can only be read by those with access to funds or to institutions that can afford subscriptions. </p>
<h2>What we can learn from SARS</h2>
<p>The business-as-usual publishing process is poorly equipped to handle a fast-moving emergency. In the 2003 SARS outbreaks in Hong Kong and Toronto, for example, only <a href="https://doi.org/10.1371/journal.pmed.1000272">22% of the epidemiological studies</a> on SARS were even submitted to journals during the outbreak. Worse, only 8% were accepted by journals and 7% published before the crisis was over.</p>
<p>Fortunately, SARS was contained within a few months, but perhaps it could have been contained even more quickly with better sharing of research. </p>
<p>Fast-forward to the COVID-19 pandemic, and the situation could not be more different. A highly infectious virus spreading across the globe has made rapid sharing of research vital. In many ways, the publishing rulebook has been thrown out the window. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/the-hunt-for-a-coronavirus-cure-is-showing-how-science-can-change-for-the-better-132130">The hunt for a coronavirus cure is showing how science can change for the better</a>
</strong>
</em>
</p>
<hr>
<h2>Preprints and journals</h2>
<p>In this medical emergency, the first versions of papers (preprints) are being submitted to preprint servers such as <a href="https://www.medrxiv.org/">medRxiv</a> and <a href="https://www.biorxiv.org">bioRxiv</a> and made openly available within a day or two of submission. These preprints (now almost 7,000 papers on just these two sites) are being downloaded <a href="https://chanzuckerberg.com/newsroom/2-million-to-medrxiv-top-source-breaking-covid-19-research/">millions of times</a> throughout the world. </p>
<p>However, exposing scientific content to the public before it has been peer-reviewed by experts increases the risk it will be misunderstood. Researchers need to engage with the public to improve understanding of how scientific knowledge evolves and to provide ways to question scientific information constructively. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/researchers-use-pre-prints-to-share-coronavirus-results-quickly-but-that-can-backfire-137501">Researchers use 'pre-prints' to share coronavirus results quickly. But that can backfire</a>
</strong>
</em>
</p>
<hr>
<p>Traditional journals have also changed their practices. Many have made research relating to the pandemic immediately available, although some have specified the content will be locked back up once the pandemic is over. For example, a <a href="https://www.elsevier.com/connect/coronavirus-information-center">website</a> of freely available COVID-19 research set up by major publisher Elsevier states: </p>
<blockquote>
<p>These permissions are granted for free by Elsevier for as long as the Elsevier COVID-19 resource centre remains active.</p>
</blockquote>
<p>Publication in journals has also sped up, though it cannot compare with the phenomenal speed of preprint servers. Interestingly, posting a preprint <a href="https://www.biorxiv.org/content/10.1101/2020.05.22.111294v1">seems</a> to speed up the peer-review process when the paper is ultimately submitted to a journal.</p>
<h2>Open data</h2>
<p>What else has changed in the pandemic? What has become clear is the power of aggregation of research. A notable initiative is the <a href="https://www.whitehouse.gov/briefings-statements/call-action-tech-community-new-machine-readable-covid-19-dataset/">COVID-19 Open Research Dataset (CORD-19)</a>, a huge, freely available public dataset of research (now more than 130,000 articles) whose development was led by the US White House Office of Science and Technology Policy.</p>
<p>Researchers can not only read this research but also reuse it, which is essential to extract its full value. The reuse is made possible by two specific technologies: permanent unique identifiers to keep track of research papers, and machine-readable conditions (licences) on the research papers, which specify how that research can be used and reused. </p>
<p>These are Creative Commons licences like those that cover projects such as <a href="https://en.wikipedia.org/wiki/Wikipedia:Copyrights">Wikipedia</a> and <a href="https://theconversation.com/au/republishing-guidelines">The Conversation</a>, and they are vital for maximising reuse. Much of this reading and reuse – at least the first scan – is now done by machines, and research that is not marked as available for reuse may not even be seen, let alone used. </p>
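<p>As a concrete sketch of machine-driven reuse, a script can select reusable papers purely from their machine-readable licence tags. The record structure, field names and values below are illustrative assumptions, not the actual schema of CORD-19 or any other dataset:</p>

```python
# Hypothetical paper records: each carries a permanent identifier (a DOI)
# and a machine-readable licence tag. These values are made up for
# illustration; they are not drawn from a real dataset.
papers = [
    {"doi": "10.1000/example.1", "license": "cc-by"},
    {"doi": "10.1000/example.2", "license": "cc-by-nc"},
    {"doi": "10.1000/example.3", "license": "no-license"},
]

# Licences permitting reuse with attribution: a machine can pick out
# reusable papers without a human ever reading the licence terms.
REUSABLE = {"cc-by", "cc-by-sa", "cc0"}

reusable_dois = [p["doi"] for p in papers if p["license"] in REUSABLE]
print(reusable_dois)  # ['10.1000/example.1']
```

<p>Research without a recognised licence tag simply never enters the reusable set – which is the point made above about unmarked research going unseen.</p>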
<p>What has also become important is the need to provide access to data behind the research papers. In a fast-moving field of research not every paper receives detailed scrutiny (especially of underlying data) before publication – but making the data available ensures claims can be validated.</p>
<p>If the data can’t be validated, the research should be treated with extreme caution – as happened to a <a href="https://www.abc.net.au/news/2020-06-05/hydroxychloroquine-study-the-lancet-peer-review-coronavirus/12324118">swiftly retracted paper</a> about the effects of hydroxychloroquine published by The Lancet in May.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/not-just-available-but-also-useful-we-must-keep-pushing-to-improve-open-access-to-research-86058">Not just available, but also useful: we must keep pushing to improve open access to research</a>
</strong>
</em>
</p>
<hr>
<h2>Overnight changes, decades in the making</h2>
<p>While opening up research literature during the pandemic may seem to have happened virtually overnight, these changes have been decades in the making. There were systems and processes in place developed over many years that could be activated when the need arose. </p>
<p>The international licences were developed by the <a href="https://creativecommons.org/">Creative Commons</a> project, which began in 2001. Advocates have been <a href="https://www.budapestopenaccessinitiative.org/">challenging</a> the dominance of commercial journal subscription models since the early 2000s, and open access journals and other publishing routes have been growing globally since then. </p>
<p>Even preprints are not new. Although preprint platforms have recently been spreading across many disciplines, they originated in <a href="https://arxiv.org/">physics</a> back in 1991.</p>
<h2>Lessons from the pandemic</h2>
<p>So where does publishing go after the pandemic? As in many areas of our lives, there are some positives to take forward from what became a necessity in the pandemic. </p>
<p>The problem with publishing during the 2003 SARS emergency wasn’t the fault of the journals – the system was not in place then for mass, rapid open publishing. As an editor at The Lancet at the time, I vividly remember we simply could not publish or even meaningfully process every paper we received. </p>
<p>But now, almost 20 years later, the tools are in place and this pandemic has made a compelling case for open publishing. Though there are initiatives ongoing across the globe, there is still a lack of coordinated, long-term, high-level commitment and investment, especially by governments, to support key open policies and infrastructure. </p>
<p>We are not out of this pandemic yet, and we know that there are even bigger challenges in the form of climate change around the corner. Making it the default that research is open so it can be built on is a crucial step to ensure we can address these problems collaboratively.</p>
<p class="fine-print"><em><span>Virginia Barbour is employed by the Australasian Open Access Strategy Group.
She provides unpaid editorial advice to medRxiv, a preprint server.</span></em></p>Scientists and science publishers are sharing information as fast as they can during the COVID-19 pandemic. Speed and openness bring new challenges, but they are the way forward for research. Virginia Barbour, Director, Australasian Open Access Strategy Group, Queensland University of Technology. Licensed as Creative Commons – attribution, no derivatives.Why the h-index is a bogus measure of academic impact<figure><img src="https://images.theconversation.com/files/345665/original/file-20200705-33922-1qdq6zc.jpg?ixlib=rb-1.1.0&rect=41%2C0%2C4625%2C3078&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A portrait of Albert Einstein on a transformer station in St.Petersburg, Russia.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure><p>Earlier this year, <a href="https://www.nytimes.com/2020/05/12/magazine/didier-raoult-hydroxychloroquine.html">French physician and microbiologist Didier Raoult generated a media uproar over his controversial promotion of hydroxychloroquine to treat COVID-19</a>. The researcher has long pointed to his growing list of publications and high number of citations as an indication of his contribution to science, all summarized in his “h-index.” </p>
<p>The controversy over his recent research presents an opportunity to examine the weaknesses of the h-index, a metric that aims to quantify a researcher’s productivity and impact, used by many organizations to evaluate researchers for promotions or research project funding. </p>
<p>Invented in 2005 by the American physicist Jorge Hirsch, the <a href="https://dx.doi.org/10.1073%2Fpnas.0507655102">Hirsch-index</a> or h-index, is an essential reference for many researchers and managers in the academic world. It is particularly promoted and used in the biomedical sciences, a field where the massive number of publications makes any serious qualitative assessment of researchers’ work almost impossible. This alleged indicator of quality has become a mirror in front of which researchers admire themselves or sneer at the pitiful h-index of their colleagues and rivals.</p>
<p>Although experts in bibliometrics — a branch of library and information sciences that uses statistical methods to analyze publications — quickly pointed out <a href="https://mitpress.mit.edu/books/bibliometrics-and-research-evaluation">the dubious nature of this composite indicator</a>, many researchers do not seem to understand that its properties make it a far-from-valid index for seriously and ethically assessing the quality or scientific impact of publications.</p>
<p>Promoters of the h-index commit an elementary error of logic. They assert that because Nobel Prize winners generally have a high h-index, the measure is a valid indicator of the individual quality of researchers. However, while a high h-index can indeed be associated with a Nobel Prize winner, this in no way proves that a low h-index is necessarily associated with a researcher of poor standing. </p>
<p>Indeed, a seemingly low h-index can hide a high scientific impact, at least if one accepts that the usual unit of measure for scientific visibility is reflected in the number of citations received.</p>
<h2>Limits of the h-index</h2>
<p>Defined as the number of articles <em>N</em> by an author that have each received at least <em>N</em> citations, the h-index is limited by the total number of published articles. For instance, if a person has 20 articles that are each cited 100 times, her h-index is 20 — just like a person who also has 20 articles, but each cited only 20 times. But no serious researcher would say that the two are equal because their h-index is the same.</p>
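<p>The definition translates directly into a few lines of code. This is a minimal sketch of the calculation described above; the function name and input format are our own, not part of any bibliometric tool:</p>

```python
def h_index(citations):
    """Largest h such that the author has h papers with at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# The two authors from the example above: same h-index,
# despite a five-fold difference in citations per paper.
print(h_index([100] * 20))  # 20
print(h_index([20] * 20))   # 20
```

<p>The cap is visible in the code: once the rank exceeds the number of articles, no further citations can raise the score.</p>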
<p>The most ironic aspect of the h-index’s history is that its inventor wanted to counter the claim that the number of published papers alone represented a researcher’s impact. So he included the number of citations the articles received. </p>
<p>But it turns out that an author’s h-index is strongly correlated (up to about 0.9) with his total number of publications. In other words, it is the number of publications that drives the index more than the number of citations, an indicator which remains the best measure of the visibility of scientific publications.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/345699/original/file-20200706-29-ao7mpj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/345699/original/file-20200706-29-ao7mpj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/345699/original/file-20200706-29-ao7mpj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=454&fit=crop&dpr=1 600w, https://images.theconversation.com/files/345699/original/file-20200706-29-ao7mpj.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=454&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/345699/original/file-20200706-29-ao7mpj.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=454&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/345699/original/file-20200706-29-ao7mpj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=571&fit=crop&dpr=1 754w, https://images.theconversation.com/files/345699/original/file-20200706-29-ao7mpj.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=571&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/345699/original/file-20200706-29-ao7mpj.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=571&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Raoult Didier made front-page news in France for promoting hydroxychloroquine as a remedy for COVID-19.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>All of this is well known to experts in bibliometrics, but perhaps less so to researchers, managers and journalists who allow themselves to be impressed by scientists parading their h-index. </p>
<h2>Raoult vs. Einstein</h2>
<p>In a recent investigation into Raoult’s research activities by the French newspaper <em>Médiapart</em>, a researcher who had been a member of the evaluation committee of Raoult’s laboratory said: “<a href="https://www.mediapart.fr/journal/france/070420/chloroquine-pourquoi-le-passe-de-didier-raoult-joue-contre-lui">What struck her was Didier Raoult’s obsession with his publications. A few minutes before the evaluation of his unit began, the first thing he showed her on his computer was his h-index</a>.” Raoult had also said in <em>Le Point</em> magazine in 2015 that “<a href="https://www.lepoint.fr/invites-du-point/didier_raoult/raoult-evaluer-la-recherche-mesurer-ou-tricher-04-10-2015-1970477_445.php">it was necessary to count the number and impact of researchers’ publications to assess the quality of their work</a>.” </p>
<p>So let’s take a look at Raoult’s h-index and see how it compares to, say, that of a researcher who is considered the greatest scientist of the last century: Albert Einstein.</p>
<p>In the <a href="https://www.webofknowledge.com/">Web of Science database</a>, Raoult has 2,053 articles published between 1979 and 2018, having received a total of 72,847 citations. His h-index, calculated from the citation counts of these articles, is 120. We know, however, that the value of this index can be artificially inflated through author self-citations — when an author cites his own previous papers. The database indicates that among the total citations attributed to the articles co-authored by Raoult, 18,145 come from articles of which he is a co-author. These self-citations amount to a total of 25 per cent. Subtracting these, Raoult’s h-index drops 13 per cent to a value of 104.</p>
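<p>The percentages quoted above follow from simple arithmetic on the Web of Science figures:</p>

```python
total_citations = 72_847   # citations to Raoult's articles in Web of Science
self_citations = 18_145    # citations coming from his own co-authored papers

self_share = self_citations / total_citations
print(f"{self_share:.0%}")  # 25%

h_all, h_without_self = 120, 104
drop = (h_all - h_without_self) / h_all
print(f"{drop:.0%}")  # 13%
```

<p>Note that the 13 per cent drop in the h-index cannot be derived from the 25 per cent citation share alone; it requires recomputing the index after removing self-citations paper by paper.</p>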
<p>Now, let’s examine the case of Einstein, who has 147 articles listed in the Web of Science database between 1901 and 1955, the year of his death. For his 147 articles, Einstein has received 1,564 citations during his lifetime. Of this total number of citations, only 27, or a meagre 1.7 per cent, are self-citations. Now, if we add the citations made to his articles after his death, Einstein has received a total of 28,404 citations between 1901 and 2019, which earns him an h-index of 56.</p>
<p>If we have to rely on the so-called “objective” measurement provided by the h-index, we are then forced to conclude that the work of Raoult has twice the scientific impact of that of Einstein, the father of the photon, special and general relativity, Bose-Einstein condensation and stimulated emission, the phenomenon at the origin of lasers. </p>
<p>Or maybe it is simpler (and better) to conclude, as already suggested, that this indicator is bogus?</p>
<p>One should note the significant difference in the number of total citations received by each of these researchers during their careers. They have obviously been active at very different times, and the size of scientific communities, and therefore the number of potential citing authors, have grown considerably over the past half-century. </p>
<p>Disciplinary differences and collaboration patterns must also be taken into account. For example, theoretical physics has far fewer contributors than microbiology, and the number of co-authors per article is smaller, which affects the measure of the productivity and impact of researchers.</p>
<p>Finally, it is important to note that the statement: “The h-index of person P is X,” has no meaning, because the value of the index depends on the content of the database used for its calculation. One should rather say: “The h-index of person P is X, in database Z.” Hence, according to the Web of Science database, which only contains journals considered to be serious and fairly visible in the scientific field, the h-index of Raoult is 120. On the other hand, in the free and therefore easily accessible database of Google Scholar, his h-index — the one most often repeated in the media — goes up to 179.</p>
<h2>Number fetishism</h2>
<p>Many scientific communities worship the h-index, and this fetishism can have harmful consequences for scientific research. France, for instance, uses a <a href="https://www.sigaps.fr/">Système d’interrogation, de gestion et d’analyse des publications scientifiques</a> to grant research funds to its biomedical science laboratories. It is based on the number of articles they publish in so-called high impact factor journals. As reported by the newspaper <em>Le Parisien</em>, the frantic pace of Raoult’s publications <a href="https://www.leparisien.fr/societe/didier-raoult-une-frenesie-de-publications-et-des-pratiques-en-question-12-06-2020-8334405.php">allows his home institution to earn between 3,600 and 14,400 euros annually for each article published by his team</a>.</p>
<p>Common sense should teach us to be wary of simplistic and one-dimensional indicators. Slowing the maddening pace of scientific publications would certainly lead researchers to lose interest in the h-index. More importantly, abandoning it would contribute to producing scientific papers that will be fewer in number, but certainly more robust.</p>
<p><em>This is a corrected version of a story originally published on July 8, 2020. The earlier story said John Hirsch had invented the h-index instead of Jorge Hirsch.</em></p>
<p class="fine-print"><em><span>Yves Gingras a reçu des financements du CRSH et du FQRSC. </span></em></p><p class="fine-print"><em><span>Mahdi Khelfaoui a reçu des financements du CRSH</span></em></p>The h-index has become an indicator of quality for many researchers and may influence the allocation of research funds. But some question its value.Yves Gingras, Professeur d’histoire et de sociologie des sciences, Université du Québec à Montréal (UQAM)Mahdi Khelfaoui, Professeur associé, Université du Québec à Montréal (UQAM)Licensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1409802020-06-18T10:42:25Z2020-06-18T10:42:25ZCoronavirus: why it’s dangerous to blindly ‘follow the science’ when there’s no consensus yet<figure><img src="https://images.theconversation.com/files/342457/original/file-20200617-94101-n59kfy.jpg?ixlib=rb-1.1.0&rect=48%2C0%2C5348%2C3562&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Rules about coronavirus research have been relaxed.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/microbiologist-tube-biological-sample-contaminated-by-1644424099">angellodeco/Shutterstock</a></span></figcaption></figure><p>The Lancet and the New England Journal of Medicine are among the most influential scientific journals in the world. Both have recently had to <a href="https://www.theguardian.com/world/2020/jun/12/covid-19-studies-based-on-flawed-surgisphere-data-force-medical-journals-to-review-processes?fbclid=IwAR3eIgHL4SnbkVx7DkVTEM757eyD6cjF0aPkIFDHk8uIQaNyJwkI7isAPsg">retract studies</a> on the effectiveness of COVID-19 treatments after doubts were raised about the underlying data. The scandal reveals the dangers of <a href="https://www.thebsps.org/auxhyp/fast-science-stegenga/">“fast science”</a>. </p>
<p>In the face of the virus emergency, research standards <a href="https://www.scientificamerican.com/article/shortcuts-in-covid-19-drug-research-could-do-long-term-harm-bioethicists-worry/">have been relaxed</a> to encourage faster publication and mistakes become inevitable. This is risky. Ultimately, if expert advice on the pandemic turns out to be wrong, it will have dire consequences for how reliable scientific evidence is treated in other policy areas, such as climate change.</p>
<p>The pandemic <a href="https://www.politico.eu/article/boris-johnsons-coronavirus-fudge/">has become politicised</a>, pitting smug liberals against reckless conservatives. There’s also a move towards framing options as science versus common sense. If we accept this framing, we risk leading people to believe that experts are no better than the rest of us at making predictions and providing explanations that can guide policy.</p>
<p>For example, some “<a href="https://www.spectator.co.uk/article/This-lockdown-may-kill-me">lockdown sceptics</a>” have responded to falling death rates by arguing that the lockdown wasn’t necessary in the first place. Setting aside arguments over the extent to which lockdowns saved lives, it is <a href="https://ftalphaville.ft.com/2020/04/15/1586943153000/Why-are-we-really-in-lockdown--/">right to worry</a> about the way this has cast aspersions on expertise more generally.</p>
<p>But we shouldn’t see the epidemiologists advising governments as having the same standing – in regard to the pandemic – as other experts have with regard to other hot-button issues that engage scientific consensus. It is misguided to think that, because epidemiology is a well-established science, the guidance it provides us with right now is necessarily perfectly reliable. </p>
<p>There is no reliable science – yet – of the novel coronavirus. Because it is novel, the models that the epidemiologists use must make assumptions based on incomplete data. </p>
<p>We have seen <a href="https://www.washingtonpost.com/health/2020/04/06/americas-most-influential-coronavirus-model-just-revised-its-estimates-downward-not-every-model-agrees/">dramatic revisions</a> in these models as some of the assumptions came to be seen as completely off-base. Even now, there is good reason to worry that some of the models governments rely on may exaggerate the infection fatality rate. Testing has concentrated on the most sick — but if others infected with mild or no symptoms were factored into the calculations, <a href="https://theconversation.com/coronavirus-bmj-study-suggests-78-dont-show-symptoms-heres-what-that-could-mean-135732">the fatality rate would be smaller</a>, by a currently unknown amount. </p>
<p>Part of the underlying problem is built into the way epidemiology is organised to deal with new, unfolding disease in a fast-moving environment. Leading epidemiologists <a href="https://bostonreview.net/science-nature/marc-lipsitch-good-science-good-science">see themselves as synthesisers</a> of “many branches of science using many methods, approaches, and forms of evidence”. But it takes time to collect and combine such evidence. </p>
<h2>Lives versus the economy</h2>
<p>Epidemiology is not the only discipline relevant to the response to the pandemic. Lockdowns themselves have costs, of an unknown magnitude. Too often, these costs are presented as economic costs, <a href="https://www.theguardian.com/commentisfree/2020/mar/14/coronavirus-is-first-a-health-problem-second-an-economic-one">as if we faced a choice</a> between a healthy economy and healthy people. But people <a href="https://www.cambridge.org/core/journals/the-british-journal-of-psychiatry/article/economic-suicides-in-the-great-recession-in-europe-and-north-america/DF85FA16DFB256F4DC7937FAEA156F8B">die from recessions</a>. </p>
<p>We should frame the issue as one pitting <a href="https://voxeu.org/article/india-s-lockdown">lives against lives</a>, not lives against the economy. Estimating the effects of lockdowns on future deaths and illness, physical and mental, is not a matter for epidemiologists alone but for a variety of disciplines – psychiatrists, sociologists, economists, educators, public health experts and many others.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/342459/original/file-20200617-94049-9csllb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/342459/original/file-20200617-94049-9csllb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/342459/original/file-20200617-94049-9csllb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/342459/original/file-20200617-94049-9csllb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/342459/original/file-20200617-94049-9csllb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/342459/original/file-20200617-94049-9csllb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/342459/original/file-20200617-94049-9csllb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Lockdown threatens lives and livelihoods.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/business-center-closed-due-covid19-sign-1698114358">Viacheslav Lopatin/Shutterstock</a></span>
</figcaption>
</figure>
<p>Coming to a reliable consensus takes time and the input of many disciplines, especially because the consequences of any policy affect so many areas of life. There simply has <a href="https://climate.nasa.gov/scientific-consensus/">not yet been enough time</a> for such a consensus to emerge.</p>
<h2>Implications for climate science</h2>
<p>Climate science looms over the pandemic debates and offers an example of the value of tested science in public policy debates. From the beginning of the crisis, many have worried that conceding anything to those with reservations about following the authority of science will play into the hands of climate sceptics.</p>
<p>There is every reason to believe that the strong consensus on climate science is fully justified. A central reason that consensus is trustworthy is that it has been stress-tested so many times from so many angles. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/342461/original/file-20200617-94078-1qgjibn.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/342461/original/file-20200617-94078-1qgjibn.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/342461/original/file-20200617-94078-1qgjibn.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/342461/original/file-20200617-94078-1qgjibn.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/342461/original/file-20200617-94078-1qgjibn.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/342461/original/file-20200617-94078-1qgjibn.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/342461/original/file-20200617-94078-1qgjibn.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Climate science is tried and tested.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/mother-polar-bear-cub-jumping-across-155225642">FloridaStock/Shutterstock</a></span>
</figcaption>
</figure>
<p>Scientific claims like “carbon emissions cause global heating” are not the province of any one discipline. Rather, the expertise of many disciplines is needed: physicists, paleoclimatologists, mathematicians, astronomers <a href="https://bravenewclimate.com/2008/08/31/so-just-who-does-climate-science/">and many more</a> have contributed to making climate science robust. All these experts are required to identify mechanisms, rule out alternative explanations and make predictions. </p>
<p>Like epidemiology, climate science provides a reliable guide to policy. But it is reliable mainly because its predictions and assumptions are further tested and assessed by many disciplines beyond climate science proper.</p>
<p>We strongly advocate giving scientific input into policy significant weight, though in this case that advice can reflect only some of the science and offers a partial picture. Taking that advice is taking a bet, and we should not be very surprised if we lose that bet in ways we only dimly understand in advance. The stakes of this bet are especially high when taking the advice requires suspending some civil rights.</p>
<p>If we do lose the bet, having framed the debate as one of experts versus sceptics will lead to a victory for the latter. That would set back our response to issues that rely on scientific certainty, especially climate change, by decades.</p>
<p>Science is our best guide to the world. But reliable science takes time and contributions by many different kinds of people, including the values of the public. We should celebrate the achievements of science, but recognise that not all science is equally warranted.</p>
<p class="fine-print"><em><span>Eric Schliesser receives funding from Netherlands Organisation of Scientific Research (NWO)</span></em></p><p class="fine-print"><em><span>Eric Winsberg and Neil Levy do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>If expert advice on the pandemic turns out to be wrong, it will have dire consequences for how reliable scientific evidence is treated in other policy areas, such as climate change.Neil Levy, Senior Research Fellow, Uehiro Centre for Practical Ethics, University of OxfordEric Schliesser, Professor of Political Science., University of AmsterdamEric Winsberg, Professor of Philosophy of Science, University of South FloridaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1223532019-09-23T12:31:49Z2019-09-23T12:31:49ZHow an AI trained to read scientific papers could predict future discoveries<figure><img src="https://images.theconversation.com/files/293402/original/file-20190920-135074-11qbsiy.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Eureka!</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/3d-rendering-robot-learning-machine-education-1058815598?src=CWZD1m_pLOpKt4N9f46eyw-1-88">Phonlamai Photo/Shutterstock </a></span></figcaption></figure><p>“Can machines think?”, asked the famous mathematician, code breaker and computer scientist <a href="https://theconversation.com/imitation-game-brings-to-life-the-real-alan-turing-pioneer-of-the-computer-age-32517">Alan Turing</a> almost 70 years ago. 
Today, some experts have no doubt that Artificial Intelligence (AI) will <a href="https://www.theverge.com/2018/11/27/18114362/ai-artificial-general-intelligence-when-achieved-martin-ford-book">soon be able to</a> develop the kind of <a href="https://theconversation.com/ai-develops-human-like-number-sense-taking-us-a-step-closer-to-building-machines-with-general-intelligence-116820">general intelligence</a> that humans have. But others argue that machines will never measure up. Although AI can already outperform humans on certain tasks – just like calculators – it cannot be taught human creativity. </p>
<p>After all, our ingenuity, which is sometimes <a href="https://theconversation.com/anthill-25-intuition-96677">driven by passion and intuition</a> rather than logic and evidence, has enabled us to make spectacular discoveries – ranging from vaccines to fundamental particles. Surely an AI won’t ever be able to compete? Well, it turns out they might. A paper <a href="https://perssongroup.lbl.gov/papers/dagdelen-2019-word-embeddings.pdf">recently published in Nature</a> reports that an AI has now managed to predict future scientific discoveries by simply extracting meaningful data from research publications. </p>
<p>Language has a deep connection with thinking, and it has shaped human societies, relationships and, ultimately, intelligence. Therefore, it is not surprising that the holy grail of AI research is the <a href="https://theconversation.com/ai-theres-a-reason-its-so-bad-at-conversation-103249">full understanding of human language</a> in all its nuances. <a href="https://en.wikipedia.org/wiki/Natural_language_processing">Natural Language Processing</a> (NLP), which is part of a much larger umbrella called machine learning, aims to assess, extract and evaluate information from textual data. </p>
<p>Children learn by interacting with the surrounding world via trial and error. Learning how to ride a bicycle often involves a few bumps and falls. In other words, we make mistakes and we learn from them. This is precisely the way <a href="https://www.scientificamerican.com/article/springtime-for-ai-the-rise-of-deep-learning/">machine learning operates</a>, sometimes with some extra “educational” input (supervised machine learning). </p>
<p>For example, an AI can learn to recognise objects in images by building up a picture of an object from many individual examples. Here, a human must show it images that either contain the object or do not. The computer then guesses whether each image contains the object, and adjusts its statistical model according to the accuracy of the guess, as judged by the human. However, we can also leave the computer program to do all the relevant learning by itself (unsupervised machine learning), where the AI automatically starts to detect patterns in the data. In either case, the program needs to evaluate how wrong its current solution is, then adjust it to minimise that error.</p>
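The guess-and-adjust loop described above can be sketched in a few lines of Python. This is purely an illustrative toy – a one-feature perceptron trained on invented data – not code from the study:

```python
# Minimal sketch of supervised learning: guess, measure the error,
# adjust the model to shrink it. Data and task are invented.

def train(examples, steps=1000, lr=0.1):
    """Fit a one-feature linear classifier: predict 1 if w*x + b > 0."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        for x, label in examples:
            guess = 1.0 if w * x + b > 0 else 0.0
            error = label - guess      # how wrong was the guess?
            w += lr * error * x        # nudge parameters to reduce it
            b += lr * error
    return w, b

# Toy task: classify numbers as "large" (label 1) or "small" (label 0).
data = [(1, 0), (2, 0), (3, 0), (7, 1), (8, 1), (9, 1)]
w, b = train(data)
predictions = [1 if w * x + b > 0 else 0 for x, _ in data]
```

Because the toy data is linearly separable, the repeated error corrections eventually stop, at which point the model classifies every training example correctly.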
<p>Suppose we want to understand some properties related to a specific material. The obvious step is to search for information from books, web pages and any other appropriate resources. However, this is time-consuming, as it may involve hours of web searching, reading articles and specialised literature. This is where NLP can help. Via sophisticated methods and techniques, computer programs can identify concepts, mutual relationships, general topics and specific properties from large textual datasets. </p>
<p>In the new study, an AI learned to retrieve information from scientific literature via unsupervised learning. This has remarkable implications. So far, most of the existing automated NLP-based methods are supervised, requiring input from humans. Although an improvement on a purely manual approach, this is still a labour-intensive job.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/worried-about-ai-taking-over-the-world-you-may-be-making-some-rather-unscientific-assumptions-103561">Worried about AI taking over the world? You may be making some rather unscientific assumptions</a>
</strong>
</em>
</p>
<hr>
<p>However, in the new study, the researchers created a system that could accurately identify and extract information independently. It used sophisticated techniques based on statistical and geometrical properties of data to identify chemical names, concepts and structures. This was based on about 1.5m abstracts of scientific papers on material science. </p>
<p>A machine learning program then classified words in the data based on specific features such as “elements”, “energetics” and “binders”. For example, “heat” was classified as part of “energetics”, and “gas” as “elements”. This helped connect certain compounds with, among other things, types of magnetism and similarities to other materials, providing insight into how the words were related – with no human intervention required. </p>
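The idea of grouping words by the contexts they share can be illustrated with a toy co-occurrence model: words that appear in similar sentences end up with similar vectors. Everything below – the mini-“abstracts” and the vocabulary – is invented for illustration and is far simpler than the embedding method used in the study:

```python
# Toy unsupervised word vectors: each word is represented by counts of
# the other words it co-occurs with, so shared contexts imply similarity.
from collections import Counter
from math import sqrt

abstracts = [
    "heat transport governs thermoelectric energy conversion",
    "thermal energy and heat flow define energetics",
    "the noble gas argon is an inert element",
    "helium is a light gas element",
]

vocab = sorted({w for a in abstracts for w in a.split()})
vectors = {w: Counter() for w in vocab}
for a in abstracts:
    words = a.split()
    for w in words:
        for other in words:
            if other != w:
                vectors[w][other] += 1   # count every co-occurring word

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[k] * v[k] for k in u)
    norm = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0
```

In this toy corpus, “heat” shares contexts with “energy” but none with “gas”, so the model groups it with the energetics terms – exactly the kind of clustering the article describes, discovered without labels.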
<h2>Scientific discoveries</h2>
<p>This method could capture complex relationships and identify layers of information that would be virtually impossible for humans to uncover. It also provided insights well ahead of what scientists could predict at the time. In fact, the AI could recommend materials for functional applications several years before their actual discovery. There were five such predictions, all based on papers published before the year 2009. For example, the AI managed to identify a substance known as CsAgGa2Se4 as a <a href="https://en.wikipedia.org/wiki/Thermoelectric_materials">thermoelectric material</a>, which scientists only discovered in 2012. So if the AI had been around in 2009, it could have speeded up the discovery.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/293472/original/file-20190922-135128-6qb0a4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/293472/original/file-20190922-135128-6qb0a4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=457&fit=crop&dpr=1 600w, https://images.theconversation.com/files/293472/original/file-20190922-135128-6qb0a4.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=457&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/293472/original/file-20190922-135128-6qb0a4.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=457&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/293472/original/file-20190922-135128-6qb0a4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=574&fit=crop&dpr=1 754w, https://images.theconversation.com/files/293472/original/file-20190922-135128-6qb0a4.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=574&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/293472/original/file-20190922-135128-6qb0a4.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=574&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Thermoelectric seebeck power module.</span>
<span class="attribution"><span class="source">wikipedia</span>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>It made the prediction by connecting the compound with words such as “chalcogenide” (material containing “<a href="https://chem.libretexts.org/Bookshelves/Inorganic_Chemistry/Supplemental_Modules_(Inorganic_Chemistry)/Descriptive_Chemistry/Elements_Organized_by_Block/2_p-Block_Elements/Group_16%3A_The_Oxygen_Family_(The_Chalcogens)">chalcogen elements</a>” such as sulfur or selenium), “<a href="https://epsrc.ukri.org/research/ourportfolio/researchareas/optoelec/">optoelectronic</a>” (electronic devices that source, detect and control light) and “<a href="https://en.wikipedia.org/wiki/Photovoltaics">photovoltaic applications</a>”. Many thermoelectric materials share such properties, and the AI was quick to show that.</p>
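A rough sketch of how such a prediction falls out of word vectors: if a material's vector sits close to the vector for “thermoelectric”, it becomes a candidate even though the two words never co-occur in any abstract. The three-dimensional vectors below are made-up numbers purely for illustration; real embeddings have hundreds of dimensions:

```python
# Hypothetical embeddings (invented numbers, not the paper's data).
# A material whose contexts resemble "chalcogenide"/"optoelectronic"
# ends up near "thermoelectric" in the vector space.
from math import sqrt

emb = {
    "thermoelectric": (0.9, 0.1, 0.0),
    "CsAgGa2Se4":     (0.8, 0.2, 0.1),  # chalcogenide, optoelectronic contexts
    "NaCl":           (0.0, 0.3, 0.9),  # unrelated contexts
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

# Rank candidate materials by closeness to the property word.
ranking = sorted((m for m in emb if m != "thermoelectric"),
                 key=lambda m: cosine(emb[m], emb["thermoelectric"]),
                 reverse=True)
```

Under these assumed vectors, the chalcogenide compound tops the ranking – a candidate thermoelectric flagged purely from the geometry of the embedding space.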
<p>This suggests that latent knowledge regarding future discoveries is to a large extent embedded in past publications. AI systems are becoming more and more independent. And there is nothing to fear. They can help us enormously to navigate through the huge amount of data and information, which is being continuously created by human activities. Despite concerns related to privacy and security, AI is changing our societies. I believe it will lead us to make better decisions, improve our daily lives and ultimately make us smarter.</p>
<p class="fine-print"><em><span>Marcello Trovati receives funding from Innovate UK. I am currently leading some Knowledge Transfer Partnerships (KTPs). </span></em></p>Creativity isn’t the only route to discovery – automated analysis of huge amounts of data works too.Marcello Trovati, Reader in Computer Science, Edge Hill UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1203232019-07-15T12:03:16Z2019-07-15T12:03:16ZUniversity of California’s showdown with the biggest academic publisher aims to change scholarly publishing for good<figure><img src="https://images.theconversation.com/files/283903/original/file-20190712-173338-1gov2o5.jpg?ixlib=rb-1.1.0&rect=311%2C0%2C4985%2C3581&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">For now, it's going to be trickier for the University of California community to access some academic journals.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/michelle658/6022758297">Michelle/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span></figcaption></figure><p>This month, academic publisher Elsevier <a href="https://www.insidehighered.com/quicktakes/2019/07/11/elsevier-ends-journal-access-uc-system">shuttered</a> the University of California’s online access to current journal articles. It’s the latest move in the high stakes <a href="https://www.latimes.com/business/hiltzik/la-fi-uc-elsevier-20190711-story.html">standoff</a> between Elsevier, the <a href="https://www.latimes.com/business/hiltzik/la-fi-hiltzik-uc-elsevier-20181207-story.html">world’s largest publisher of scholarly research</a>, and the University of California, whose scholars <a href="https://accountability.universityofcalifornia.edu/2018/chapters/chapter-9.html">produce about 10%</a> of the nation’s research publications.</p>
<p>Last February, Elsevier chose to continue providing access to journals via its <a href="https://www.sciencedirect.com">ScienceDirect</a> online platform after UC’s <a href="https://www.universityofcalifornia.edu/press-room/uc-terminates-subscriptions-worlds-largest-scientific-publisher-push-open-access-publicly">subscription expired</a> and negotiations broke down. With its instant access now cut off, the UC research community will learn firsthand what it’s like to rely on the open web and other means of accessing critical research.</p>
<p>The UC-Elsevier showdown made headlines because it’s symptomatic of the way the internet has failed to deliver on the promise to make knowledge easily accessible and shareable by anyone, anywhere in the world. It’s the latest in a succession of cracks in what is widely considered to be a <a href="https://www.researchinformation.info/news/new-report-warns-%E2%80%98failing-system%E2%80%99-scholarly-publishing">failing system</a> for sharing academic research. <a href="https://leadership.ucdavis.edu/people/mackenzie-smith">As the head of the research library at UC Davis</a>, I see this development as a <a href="https://osc.universityofcalifornia.edu/open-access-at-uc/publisher-negotiations/uc-and-elsevier-impact/">harbinger of a tectonic shift</a> in how universities and their faculty share research, build reputations and preserve knowledge in the digital age.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/262544/original/file-20190306-100796-1cwusvb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/262544/original/file-20190306-100796-1cwusvb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/262544/original/file-20190306-100796-1cwusvb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=464&fit=crop&dpr=1 600w, https://images.theconversation.com/files/262544/original/file-20190306-100796-1cwusvb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=464&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/262544/original/file-20190306-100796-1cwusvb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=464&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/262544/original/file-20190306-100796-1cwusvb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=583&fit=crop&dpr=1 754w, https://images.theconversation.com/files/262544/original/file-20190306-100796-1cwusvb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=583&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/262544/original/file-20190306-100796-1cwusvb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=583&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Accessing a journal no longer means going to a periodicals room.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/boston_public_library/13873472463/in/faves-52792775@N00/">Newton W. Elwell/Boston Public Library/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<h2>Moving from stacks to screens</h2>
<p>Here’s how things traditionally worked.</p>
<p>Universities have always subscribed to scientific journals so their researchers can study and build on the work that came before, and won’t needlessly duplicate research they never knew about. In the print age, university library shelves were lined with journals, available for any researcher or – in the case of public universities like the University of California – any member of the public to peruse and learn from.</p>
<p>Now, for almost all journals, and a growing number of books, libraries sign contracts to license access to digital versions. Since academic publishers moved their journals online, it has become rare for libraries to subscribe to printed journals, and researchers have adapted to the convenience of accessing journal articles on the internet.</p>
<p>Under the new business model of licensing access to journals online rather than distributing them in print, for-profit publishers often lock libraries into bundled subscriptions that wrap the majority of a publisher’s portfolio of journals – <a href="https://www.elsevier.com/books-and-journals">almost 3,000 in Elsevier’s case</a> – into a single, multimillion dollar package. Rather than storing back issues on shelves, libraries can lose permanent access to journals when a contract expires. And members of the public can no longer read the library’s copy of a journal because the licenses are limited to members of the university. Now the public must buy online copies of academic articles for an average of US$35 to $40 a pop. </p>
<p>The shift to digital has been good for researchers in many ways. It is far more convenient to search for articles online, and easier to access and download a copy – provided you work for an institution with a paid subscription. Modern software makes organizing and annotating them simpler, too. With all of these benefits, no one would advocate for going back to the old days of print journals. </p>
<p>Online access to journals did not improve the picture overall. Despite digital copies of articles costing nothing to duplicate and the cost of producing an article online being lower than in the past, the cost to libraries of licensing access to them has continued to experience <a href="https://www.arl.org/wp-content/uploads/2018/09/5yr_ongoing-resource-expenditures_by_type.pdf">hyperinflation</a>. <a href="https://theconversation.com/academic-journal-publishing-is-headed-for-a-day-of-reckoning-80869">No library</a> can afford to license all the journals its faculty and students want access to, and many researchers around the world are shut out completely. Compounding the problem, <a href="https://doi.org/10.1371/journal.pone.0127502">consolidation in the scholarly publishing market</a> has reduced competition significantly, causing even more price inflexibility.</p>
<h2>Excessive profits?</h2>
<p>Academic publishers certainly bear costs. They pay for professional editors and programmers, they manage the peer review process, they market their journals and so on. However, their revenues far exceed these costs, yielding profit margins among the highest of any company in the world. Elsevier’s profit margin is reported to be <a href="https://www.forbes.com/sites/kittyknowles/2018/06/13/blockchain-science-iris-ai-project-aiur-elsevier-academic-journal-london-tech-week-cogx/">nearly 40%</a>, far higher than even Apple at <a href="https://www.macrotrends.net/stocks/charts/AAPL/apple/profit-margins">around 23%</a>.</p>
<p>Where social media platforms like Facebook profit from – and indeed, would not exist without – the content generated by users, <a href="https://www.theguardian.com/commentisfree/2019/mar/04/the-guardian-view-on-academic-publishing-disastrous-capitalism">the parallel is true</a> for academic journals. Companies like Elsevier receive articles from university faculty and other researchers for free, summarizing research that was often publicly funded by government grants. Then other faculty and researchers serve on their editorial boards and peer-review those articles for free or a nominal fee. Finally the company publishes them in journals available only behind a paywall.</p>
<p>And there’s the rub: the paywall. The great promise of the internet was that it would make <a href="https://en.wikipedia.org/wiki/Information_wants_to_be_free">knowledge more freely and easily accessible</a>. In the world of academic research – where new discoveries are made and new knowledge is born – the hope 20 years ago was that the advent of online platforms would make research articles universally available. It would also bring down the cost of publishing scholarly journals and, consequently, begin to reduce the multimillion dollar subscription costs borne by universities and other research institutions.</p>
<p>Instead, articles are not readily available to everyone, subscription costs have continued to rise, and subscribers’ rights have eroded, including what they can do with articles they buy and their ability to provide long-term access to them.</p>
<p>What happened to sharing knowledge with the people who need it, funded or created it?</p>
<h2>A model for the digital age</h2>
<p>Maybe it’s time to just blow up the whole system and start over. But today’s scholarly system of sharing knowledge evolved over hundreds of years and contains certain qualities – peer review of accuracy, editorial judgment, long-term preservation – that still matter deeply.</p>
<p>While research products – books, journals and articles – would definitely benefit from modernization in the digital age, we at the University of California are focusing on fixing the business model first. Paywalls and online subscriptions may make sense in other parts of the media ecosystem, but they are not a good model for academic publishing, where authors and reviewers are paid by universities and research grants (with public money) rather than by publishers.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/262775/original/file-20190307-82688-17ky3w8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/262775/original/file-20190307-82688-17ky3w8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/262775/original/file-20190307-82688-17ky3w8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=313&fit=crop&dpr=1 600w, https://images.theconversation.com/files/262775/original/file-20190307-82688-17ky3w8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=313&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/262775/original/file-20190307-82688-17ky3w8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=313&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/262775/original/file-20190307-82688-17ky3w8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=393&fit=crop&dpr=1 754w, https://images.theconversation.com/files/262775/original/file-20190307-82688-17ky3w8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=393&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/262775/original/file-20190307-82688-17ky3w8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=393&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The open access movement would get rid of paywalls and let anyone read anything for free.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/set-ornate-gates-heaven-opening-under-158678495">Inked Pixels/Shutterstock.com</a></span>
</figcaption>
</figure>
<p>Fortunately, the academy has another option for a publishing business model that can better achieve the promise of the internet: open access. In that model, authors, or their funders or institutions, pay the publisher a fee to cover the cost of publishing each article. In exchange, the articles are made freely available for everyone to read online, anywhere, anytime. Article quality is preserved by the same unpaid peer-review system. Libraries at research institutions could shift their payments from licenses and subscriptions to publication fees for their affiliated authors. The cost is theoretically the same, but everyone can read everything for free. </p>
<p>The University of California has long <a href="https://osc.universityofcalifornia.edu/open-access-at-uc/open-access-policy/">supported the ideals of open access</a> – allowing everyone in the world to access the knowledge created by its faculty and researchers, for the benefit of all. In fact, since open access became an option for publishing, more and more UC authors, following the <a href="https://doi.org/10.7717/peerj.4375">global trend</a>, have independently chosen to pay their publisher a fee in order to make their article freely available to the public.</p>
<p>But those fees come on top of the <a href="https://www.universityofcalifornia.edu/news/why-uc-split-publishing-giant-elsevier">tens of millions of dollars</a> that the university is already paying the publisher for access to the same articles. This “<a href="https://www.wsj.com/articles/the-science-of-the-tax-dollar-double-dip-1459379449">double dipping</a>” by publishers was the final straw in UC’s resolve to change the system.</p>
<p>Several years ago, <a href="https://scholar.google.com/citations?user=_Sp-B_0AAAAJ&hl=en&oi=ao">I worked</a> with colleagues within the University of California and other academic research institutions to study the <a href="https://www.library.ucdavis.edu/icis/uc-pay-it-forward-project/">costs of publishing with this open access model</a>. We found that, while costs would shift and more research-grant funds may need to be applied to publishing fees, <a href="http://dx.doi.org/10.5703/1288284316481">overall it would be affordable</a> for research universities, at least in North America where libraries are relatively well funded. With these results, UC could see a way out of its dilemma. </p>
<p>When UC’s contract with Elsevier was up for renewal, we resolved to put our ideas into practice and pursue the twin goals of increasing open access to UC’s research while containing or lowering our journal-related costs – and finally achieving something of the promise of the internet. While I was not on UC’s negotiating team, I was among the group of faculty and library leaders that worked closely with them and decided to take this step.</p>
<p>Now, UC’s researchers will have to find ways to get Elsevier journal articles other than through the online access they have become accustomed to. Many of those articles are already <a href="https://www.the-scientist.com/daily-news/open-access-on-the-rise-study-31125">freely available</a> on the web and the rest can be borrowed from libraries or requested from authors. There are also a growing number of tools like <a href="https://unpaywall.org/">Unpaywall</a>, which searches the web for free copies of articles, to help researchers with that transition. But for busy researchers with little time to spare, convenience is king, and they’ll likely soon learn from experience why achieving 100% open access to research articles is so important.</p>
<p>UC’s goals are ambitious and their implementation will be complex. Changing a system this intricate is akin to modernizing the FAA’s air traffic control system – a million planes are in the air at any moment and altering anything can have serious consequences elsewhere. But we have to start somewhere or the whole system is at risk, and UC has placed its bet. We join an expanding <a href="https://oa2020.org/mission/#eois">global movement</a>, and we believe we’re now on the path to a better system for sharing knowledge in the 21st century.</p>
<hr>
<p><em>This is an updated version of <a href="https://theconversation.com/university-of-californias-break-with-the-biggest-academic-publisher-could-shake-up-scholarly-publishing-for-good-112941">an article</a> originally published on March 7, 2019.</em></p><img src="https://counter.theconversation.com/content/120323/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>MacKenzie Smith receives funding from the Andrew W. Mellon Foundation, the Institute of Museum and Library Services and the National Science Foundation. </span></em></p>The UC libraries let their Elsevier journal subscriptions lapse and now the publisher has cut their online access. It’s a painful milestone in the fight UC hopes may transform how journals get paid.MacKenzie Smith, University Librarian and Vice Provost for Digital Scholarship, University of California, DavisLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1129412019-03-08T01:28:07Z2019-03-08T01:28:07ZUniversity of California’s break with the biggest academic publisher could shake up scholarly publishing for good<figure><img src="https://images.theconversation.com/files/262542/original/file-20190306-100784-oqhxay.jpg?ixlib=rb-1.1.0&rect=155%2C248%2C2355%2C1691&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Libraries subscribe digitally to academic journals – and are left with nothing in the stacks when the contract expires.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/maveric2003/137231015">Eric Chan/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span></figcaption></figure><p>The University of California recently made international headlines when it <a href="https://www.universityofcalifornia.edu/press-room/uc-terminates-subscriptions-worlds-largest-scientific-publisher-push-open-access-publicly">canceled its subscription</a> with scientific journal publisher <a href="https://www.elsevier.com/">Elsevier</a>. The twittersphere <a href="https://twitter.com/hashtag/ucelsevier">lit up</a>. 
And Elsevier’s parent company, RELX, saw its stock <a href="https://uk.reuters.com/article/uk-britain-stocks/uk-main-index-bounces-back-on-wpp-strength-relx-tumbles-idUKKCN1QI42N">drop 7 percent</a> in response to the announcement.</p>
<p>A library canceling a subscription seems like a simple, everyday business decision, so what’s the big deal?</p>
<p>It was not just the clash-of-the-titans drama between the University of California, whose scholars <a href="https://accountability.universityofcalifornia.edu/2018/chapters/chapter-9.html">produce nearly 10 percent</a> of the nation’s research publications, and Elsevier, the <a href="https://www.latimes.com/business/hiltzik/la-fi-hiltzik-uc-elsevier-20181207-story.html">world’s largest publisher</a> of academic research. </p>
<p>The story made headlines because it’s symptomatic of the way in which the internet has failed to deliver on the promise to make knowledge easily accessible and shareable by anyone, anywhere in the world. The UC-Elsevier showdown was the latest in a succession of cracks in what is widely considered to be a <a href="https://www.researchinformation.info/news/new-report-warns-%E2%80%98failing-system%E2%80%99-scholarly-publishing">failing system</a> for sharing academic research. <a href="https://leadership.ucdavis.edu/people/mackenzie-smith">As the head of the research library at UC Davis</a>, I see this development as a harbinger of a tectonic shift in how universities and their faculty share research, build reputations and preserve knowledge in the digital age.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/262544/original/file-20190306-100796-1cwusvb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/262544/original/file-20190306-100796-1cwusvb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/262544/original/file-20190306-100796-1cwusvb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=464&fit=crop&dpr=1 600w, https://images.theconversation.com/files/262544/original/file-20190306-100796-1cwusvb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=464&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/262544/original/file-20190306-100796-1cwusvb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=464&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/262544/original/file-20190306-100796-1cwusvb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=583&fit=crop&dpr=1 754w, https://images.theconversation.com/files/262544/original/file-20190306-100796-1cwusvb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=583&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/262544/original/file-20190306-100796-1cwusvb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=583&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Accessing a journal no longer means going to a periodicals room.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/boston_public_library/13873472463/in/faves-52792775@N00/">Newton W. Elwell/Boston Public Library/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<h2>Moving from stacks to screens</h2>
<p>Here’s how things traditionally worked.</p>
<p>Universities have always subscribed to scientific journals so their researchers can study and build on the work that came before, and won’t needlessly duplicate research they never knew about. In the print age, university library shelves were lined with journals, available for any researcher or – in the case of public universities like the University of California – any member of the public to peruse and learn from.</p>
<p>Now, for almost all journals, and a growing number of books, libraries sign contracts to license access to digital versions. Since academic publishers moved their journals online, it has become rare for libraries to subscribe to printed journals, and researchers have adapted to the convenience of accessing journal articles on the internet.</p>
<p>Under the new business model of licensing access to journals online rather than distributing them in print, for-profit publishers often lock libraries into bundled subscriptions that wrap the majority of a publisher’s portfolio of journals – <a href="https://www.elsevier.com/books-and-journals">almost 3,000 in Elsevier’s case</a> – into a single, multi-million dollar package. Rather than storing back issues on shelves, libraries can lose permanent access to journals when a contract expires. And members of the public can no longer read the library’s copy of a journal because the licenses are limited to members of the university. Now the public must buy online copies of academic articles for an average of US$35 to $40 a pop. </p>
<p>The shift to digital has been good for researchers in many ways. It is far more convenient to search for articles online, and easier to access and download a copy – provided you work for an institution with a paid subscription. Modern software makes organizing and annotating them simpler, too. With all of these benefits, no one would advocate for going back to the old days of print journals. </p>
<p>But this online system did not improve the picture overall. Despite digital copies of articles costing nothing to duplicate and the cost of producing an article online being lower than in the past, the cost to libraries of licensing access to them has continued to experience <a href="https://www.arl.org/storage/documents/5yr_ongoing-resource-expenditures_by_type.pdf">hyperinflation</a>. <a href="https://theconversation.com/academic-journal-publishing-is-headed-for-a-day-of-reckoning-80869">No library</a> can afford to license all of the journals that its faculty and students want access to, and many researchers around the world are shut out completely. Compounding the problem, <a href="https://doi.org/10.1371/journal.pone.0127502">consolidation in the scholarly publishing market</a> has reduced competition significantly, causing even more price inflexibility.</p>
<h2>Excessive profits?</h2>
<p>Academic publishers certainly bear costs. They pay for professional editors and programmers, they manage the peer review process, they market their journals and so on. However, their revenues far exceed these costs, and their profit margins are among the highest of any company in the world. Elsevier’s profit margin is reported to be <a href="https://www.forbes.com/sites/kittyknowles/2018/06/13/blockchain-science-iris-ai-project-aiur-elsevier-academic-journal-london-tech-week-cogx/">nearly 40 percent</a> – far higher than even Apple’s, at <a href="https://www.macrotrends.net/stocks/charts/AAPL/apple/profit-margins">around 23 percent</a>.</p>
<p>Just as social media platforms like Facebook profit from – and indeed, would not exist without – the content generated by users, <a href="https://www.theguardian.com/commentisfree/2019/mar/04/the-guardian-view-on-academic-publishing-disastrous-capitalism">the same is true</a> for academic journals. Companies like Elsevier receive articles from university faculty and other researchers for free, summarizing research that was often publicly funded by government grants. Then other faculty and researchers serve on their editorial boards and peer-review those articles for free or a nominal fee. Finally, the company publishes them in journals available only behind a paywall.</p>
<p>And there’s the rub: the paywall. The great promise of the internet was that it would make <a href="https://en.wikipedia.org/wiki/Information_wants_to_be_free">knowledge more freely and easily accessible</a>. In the world of academic research – where new discoveries are made and new knowledge is born – the hope 20 years ago was that the advent of online platforms would make research articles universally available. It would also bring down the cost of publishing scholarly journals and, consequently, begin to reduce the multi-million dollar subscription costs borne by universities and other research institutions.</p>
<p>Instead, articles are not readily available to everyone, subscription costs have continued to rise, and subscribers’ rights have eroded, including what they can do with articles they buy and their ability to provide long-term access to them.</p>
<p>What happened to sharing knowledge with the people who need it, funded it or created it?</p>
<h2>A model for the digital age</h2>
<p>Maybe it’s time to just blow up the whole system and start over. But the system of sharing knowledge that scholars have today evolved over hundreds of years and contains certain qualities – peer review of accuracy, editorial judgment, long-term preservation – that still matter deeply.</p>
<p>While research products – books, journals and articles – would definitely benefit from modernization in the digital age, we at the University of California are focusing on fixing the business model first. Paywalls and online subscriptions may make sense in other parts of the media ecosystem, but they are not a good model for academic publishing, where authors and reviewers are paid by universities and research grants (with public money) rather than by publishers.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/262775/original/file-20190307-82688-17ky3w8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/262775/original/file-20190307-82688-17ky3w8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/262775/original/file-20190307-82688-17ky3w8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=313&fit=crop&dpr=1 600w, https://images.theconversation.com/files/262775/original/file-20190307-82688-17ky3w8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=313&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/262775/original/file-20190307-82688-17ky3w8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=313&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/262775/original/file-20190307-82688-17ky3w8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=393&fit=crop&dpr=1 754w, https://images.theconversation.com/files/262775/original/file-20190307-82688-17ky3w8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=393&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/262775/original/file-20190307-82688-17ky3w8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=393&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The open access movement would get rid of paywalls and let anyone read anything for free.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/set-ornate-gates-heaven-opening-under-158678495">Inked Pixels/Shutterstock.com</a></span>
</figcaption>
</figure>
<p>Fortunately, the academy has another option for a publishing business model that can better achieve the promise of the internet: open access. In that model, authors, or their funders or institutions, pay the publisher a fee to cover the cost of publishing each article. In exchange, the articles are made freely available for everyone to read online, anywhere, anytime. Article quality is preserved by the same unpaid peer-review system. Libraries at research institutions could shift their payments from licenses and subscriptions to publication fees for their affiliated authors. The cost is theoretically the same, but everyone can read everything for free. </p>
<p>The University of California has long <a href="https://osc.universityofcalifornia.edu/open-access-at-uc/open-access-policy/">supported the ideals of open access</a> – allowing everyone in the world to access the knowledge created by its faculty and researchers, for the benefit of all. In fact, since open access became an option for publishing, more and more UC authors, following the <a href="https://doi.org/10.7717/peerj.4375">global trend</a>, have independently chosen to pay their publisher a fee in order to make their article freely available to the public.</p>
<p>But those fees come on top of the <a href="https://www.universityofcalifornia.edu/news/why-uc-split-publishing-giant-elsevier">tens of millions of dollars</a> that the university is already paying the publisher for access to the same articles. This “<a href="https://www.wsj.com/articles/the-science-of-the-tax-dollar-double-dip-1459379449">double dipping</a>” by publishers was the final straw that cemented UC’s resolve to change the system.</p>
<p>Several years ago, <a href="https://scholar.google.com/citations?user=_Sp-B_0AAAAJ&hl=en&oi=ao">I worked</a> with colleagues within the University of California and other academic research institutions to study the <a href="https://www.library.ucdavis.edu/icis/uc-pay-it-forward-project/">costs of publishing with this open access model</a>. We found that, while costs would shift and more research-grant funds may need to be applied to publishing fees, <a href="http://dx.doi.org/10.5703/1288284316481">overall it would be affordable</a> for research universities, at least in North America where libraries are relatively well funded. With these results, UC could see a way out of its dilemma. </p>
<p>When UC’s contract with Elsevier was up for renewal, we resolved to put our ideas into practice and pursue the twin goals of increasing open access to UC’s research while containing or lowering our journal-related costs – and finally achieving something of the promise of the internet. While I was not on UC’s negotiating team, I was among the group of faculty and library leaders that worked closely with them and decided to take this step.</p>
<p>Our goals are ambitious and their implementation will be complex. Changing a system this intricate is akin to modernizing the FAA’s air traffic control system – a million planes are in the air at any moment and changing anything can have serious consequences elsewhere. But we have to start somewhere or the whole system is at risk, and UC has placed its bet. We join a <a href="https://oa2020.org/mission/#eois">global movement</a> that began in Europe and is expanding around the world, and we believe we’re now on the path to a better system for sharing knowledge in the 21st century.</p><img src="https://counter.theconversation.com/content/112941/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>MacKenzie Smith receives funding from Mellon Foundation, IMLS, NSF. </span></em></p>Digital publishing hasn’t resulted in the free and open access to information many envisioned. Universities are increasingly fed up with a system they see as charging them for their own scholars’ labor.MacKenzie Smith, University Librarian and Vice Provost for Digital Scholarship, University of California, DavisLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1042002018-10-25T10:48:06Z2018-10-25T10:48:06ZOverhype and ‘research laundering’ are a self-inflicted wound for social science<figure><img src="https://images.theconversation.com/files/242169/original/file-20181024-71026-1rxjnlj.jpg?ixlib=rb-1.1.0&rect=11%2C259%2C3733%2C2945&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Overselling slim results can get research findings into the hands of news consumers.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-vector/vintage-newspaper-boy-shouting-latest-news-258824045">durantelallera/Shutterstock.com</a></span></figcaption></figure><p>Earlier this fall, Dartmouth College researchers released a study <a href="https://doi.org/10.1073/pnas.1611617114">claiming to link</a> violent video games to aggression in kids. The logic of a meta-analytic study like this one is that by combining many individual studies, scientists can look for common trends or effects identified in earlier work. Only, as a psychology researcher who’s long focused on this area, I contend this meta-analysis did nothing of the sort. In fact, the magnitude of the effect they found is about the same as that of <a href="https://www.wired.com/story/its-time-for-a-serious-talk-about-the-science-of-tech-addiction/">eating potatoes on teen suicide</a>. If anything, it suggests video games do not predict youth aggression.</p>
<p>This study, and others like it, are symptomatic of a big problem within social science: the overhyping of dodgy, unreliable research findings that have little real-world application. Often such findings shape public perceptions of the human condition and <a href="https://www.law.cornell.edu/supct/pdf/08-1448P.ZO">guide public policy</a> – despite largely being rubbish. Here’s how it happens.</p>
<p>The last few years have seen psychology, in particular, embroiled in what some call a <a href="https://www.theatlantic.com/science/archive/2016/03/psychologys-replication-crisis-cant-be-wished-away/472272/">reproducibility crisis</a>. Many long-cherished findings in social science more broadly have <a href="https://www.washingtonpost.com/news/speaking-of-science/wp/2018/08/27/researchers-replicate-just-13-of-21-social-science-experiments-published-in-top-journals/">proven difficult</a> to replicate under rigorous conditions. When a study is run again, it doesn’t turn up the same results as originally published. The <a href="https://doi.org/10.1126/science.1255484">pressure to publish positive findings</a> and the tendency for researchers to <a href="https://doi.org/10.1126/science.1255484">inject their own biases</a> into analyses intensify the issue. Much of this failure to replicate can be addressed with more transparent and rigorous methods in social science.</p>
<p>But the overhyping of weak results is different. It can’t be fixed methodologically; a solution would need to come from a cultural change within the field. But incentives to be upfront about shortcomings are few, particularly for a field such as psychology, <a href="https://doi.org/10.1037/a0023963">which worries</a> over <a href="http://dx.doi.org/10.1037/a0039405">public perception</a>. </p>
<p>One example is the Implicit Association Test (IAT). This technique is most famous for probing for unconscious racial biases. Given the attention it and the theories based upon it have received, something of a cottage industry has developed to <a href="https://thinkprogress.org/starbucks-ceo-plans-racial-bias-training-89ba69933de2/">train employees about their implicit biases</a> and how to overcome them. Unfortunately, a number of studies suggest the IAT is <a href="https://www.thecut.com/2017/01/psychologys-racism-measuring-tool-isnt-up-to-the-job.html">unreliable and doesn’t predict real-world behavior</a>. Combating racial bias is a laudable goal, but the considerable public investment in the IAT and the concept of implicit biases is likely less productive than advertised.</p>
<p>Part of the problem is something I call “death by press release.” This phenomenon occurs when researchers or their university, or a journal-publishing organization such as the American Psychological Association, releases a press release that hypes a study’s findings without detailing its limitations. Sensationalistic claims tend to <a href="https://doi.org/10.1089/cyber.2017.0364">get more news attention</a>.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/242170/original/file-20181024-71017-1u3rl6d.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/242170/original/file-20181024-71017-1u3rl6d.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/242170/original/file-20181024-71017-1u3rl6d.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=705&fit=crop&dpr=1 600w, https://images.theconversation.com/files/242170/original/file-20181024-71017-1u3rl6d.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=705&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/242170/original/file-20181024-71017-1u3rl6d.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=705&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/242170/original/file-20181024-71017-1u3rl6d.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=886&fit=crop&dpr=1 754w, https://images.theconversation.com/files/242170/original/file-20181024-71017-1u3rl6d.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=886&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/242170/original/file-20181024-71017-1u3rl6d.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=886&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">An easy tweak to get kids enthusiastically eating their veggies was too good to be true.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/happy-boy-carrot-healthy-food-concept-217315831">ilikestudio/Shutterstock.com</a></span>
</figcaption>
</figure>
<p>For instance, one now notorious food lab at Cornell experienced <a href="https://www.washingtonpost.com/health/2018/09/20/this-ivy-league-food-scientist-was-media-darling-now-his-studies-are-being-retracted/">multiple retractions</a> after it came out that they tortured their data in order to get headline-friendly conclusions. Their research suggested that people ate more when served larger portions, action television shows increased food consumption, and kids’ vegetable consumption would go up if produce was rebranded with kid-friendly themes such as “X-ray vision carrots.” Mainly, lab leader Brian Wansink appears to have become an expert in <a href="https://slate.com/technology/2018/02/how-brian-wansink-forgot-the-difference-between-science-and-marketing.html">marketing social science</a>, even though most of the conclusions were flimsy. </p>
<p>Another concern is a process I call “science laundering” – the cleaning up of dirty, messy, inconclusive science for public consumption. In my own area of expertise, the Dartmouth meta-analysis on video games is a good example. <a href="https://doi.org/10.1177/1745691615592234">Similar evidence</a> to what was fed into the meta-analysis had been available for years, and it in fact forms the basis for <a href="https://doi.org/10.1111/jcom.12293">why most scholars</a> no longer link violent games to youth assaults.</p>
<p><a href="https://www.sciencemag.org/news/2018/09/meta-analyses-were-supposed-end-scientific-debates-often-they-only-cause-more">Science magazine</a> recently discussed how meta-analyses can be misused to try to prematurely end scientific debates. Meta-analyses can be helpful when they illuminate scientific practices that may cause spurious effects, in order to guide future research. But they can artificially smooth over important disagreements between studies.</p>
<p>Let’s say we hypothesize that eating blueberries cures depression. We run 100 studies to test this hypothesis. Imagine about 25 percent of our experiments find small links between blueberries and reduced depression, whereas the other 75 percent show nothing. Most people would agree this is a pretty poor showing for the blueberry hypothesis. The bulk of our evidence didn’t find any improvement in depression after eating the berries. But, due to a quirk of meta-analysis, combining all 100 of our studies together would show what scientists call a “statistically significant” effect – meaning something that was unlikely to happen just by chance – even though most of the individual studies on their own were not statistically significant.</p>
<p>Merging together even a few studies that show an effect with a larger group of studies that don’t can end up with a meta-analysis result that looks statistically significant – even if the individual studies varied quite a bit. These types of results constitute what some psychologists have called the “<a href="http://goodsciencebadscience.nl/?p=471">crud factor</a>” of psychological research – statistically significant findings that are noise, not real effects that reflect anything in the real world. Or, put bluntly, meta-analyses are a great tool for scholars to fool themselves with.</p>
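The blueberry thought experiment above can be sketched numerically. The following is a minimal simulation, not drawn from any of the studies discussed: the effect size, group sizes and fixed-effect (inverse-variance) pooling are illustrative assumptions chosen so that only a minority of studies reach significance on their own while the pooled estimate does.

```python
import numpy as np

# Illustrative assumptions (not from the article): a tiny true effect
# (about 0.1 standard deviations), 100 studies, 100 participants per arm.
rng = np.random.default_rng(42)
n_studies, n_per_group, true_effect = 100, 100, 0.1
se = np.sqrt(2.0 / n_per_group)  # standard error of a mean difference (SD = 1)

# Each study: difference in group means between "blueberry" and control arms.
diffs = np.array([
    rng.normal(true_effect, 1, n_per_group).mean()
    - rng.normal(0.0, 1, n_per_group).mean()
    for _ in range(n_studies)
])

# How many studies are significant individually (two-sided, alpha = 0.05)?
frac_significant = np.mean(np.abs(diffs / se) > 1.96)

# Fixed-effect pooled estimate: with equal SEs, just the mean of the
# per-study differences, with a standard error shrunk by sqrt(n_studies).
pooled = diffs.mean()
pooled_z = pooled / (se / np.sqrt(n_studies))

print(f"individually significant: {frac_significant:.0%}")
print(f"pooled z statistic: {pooled_z:.1f}")
```

Under these assumptions only roughly a tenth of the individual studies clear the significance bar, yet the pooled z statistic lands far beyond 1.96 – exactly the "statistically significant but mostly null" pattern described above.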
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/242098/original/file-20181024-71035-tvpd3d.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/242098/original/file-20181024-71035-tvpd3d.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/242098/original/file-20181024-71035-tvpd3d.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/242098/original/file-20181024-71035-tvpd3d.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/242098/original/file-20181024-71035-tvpd3d.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/242098/original/file-20181024-71035-tvpd3d.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/242098/original/file-20181024-71035-tvpd3d.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/242098/original/file-20181024-71035-tvpd3d.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Nobody wants quality research to languish unseen in archives….</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/moonlightbulb/6307961852">Selena N. B. H./Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>Professional guild organizations for fields such as psychology and pediatrics should shoulder much of the blame for the spread of research overhyping. Such organizations release numerous, <a href="https://doi.org/10.1016/0732-118X(95)00025-C">often deeply flawed</a>, policy statements trumpeting research findings in a field. The public often does not realize that such organizations function to market and <a href="https://psychcentral.com/blog/why-the-apa-is-losing-members/">promote a profession</a>; they’re not neutral, objective observers of scientific research – which is often published, <a href="http://ar2016.apa.org/financials/">for income</a>, in their own journals. </p>
<p>Unfortunately, such science laundering can come back to haunt a field when overhyped claims turn out to be misleading. Dishonest overpromotion of social science can cause the public and <a href="https://doi.org/10.4065/mcp.2010.0762">the courts</a> to grow more skeptical of it. Why should taxpayers fund research that is oversold rubbish? Why should media consumers trust what research says today if they were burned by what it said yesterday?</p>
<p>Individual scholars and the professional guilds that represent them can do much to fix these issues by reconsidering lax standards of evidence, the overselling of weak effects, and the current lack of upfront honesty about methodological limitations. In the meantime, the public would do well to continue applying a healthy dose of critical thinking to lofty claims coming from press releases in the social sciences. Ask whether the magnitude of the effect is significantly greater than that of potatoes on teen suicide. If the answer is no, it’s time to move on.</p><img src="https://counter.theconversation.com/content/104200/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Christopher J. Ferguson does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Breathless press releases, over-interpreted meta-analyses and other ‘crud factors’ mean that weak research results can get overhyped to the public. It’s time for a cultural change in the social sciences.Christopher J. Ferguson, Professor of Psychology, Stetson University Licensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/929992018-03-08T11:40:18Z2018-03-08T11:40:18ZPerish not publish? New study quantifies the lack of female authors in scientific journals<figure><img src="https://images.theconversation.com/files/209364/original/file-20180307-146694-c6r2ty.jpg?ixlib=rb-1.1.0&rect=123%2C119%2C2110%2C1541&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">It's not good if women's research isn't in the library stacks.</span> <span class="attribution"><a class="source" href="https://unsplash.com/photos/9o8YdYGTT64">Redd Angelo on Unsplash</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span></figcaption></figure><p>“Publish or perish” is tattooed on the mind of every academic. Like it or loathe it, publishing in high-profile journals is the fast track to positions in prestigious universities with illustrious colleagues and lavish resources, celebrated awards and plentiful grant funding. Yet somehow, in the search to understand why <a href="https://www.nap.edu/catalog/11624/to-recruit-and-advance-women-students-and-faculty-in-science">women’s scientific careers often fail to thrive</a>, the role of high-impact journals has received little scrutiny. </p>
<p>One reason is that these journals don’t even collect data about the gender or ethnic background of their authors. To examine the representation of women within these journals, we teamed up with our colleagues Jason Webster and <a href="https://scholar.google.com/citations?user=zoE5t6YAAAAJ&hl=en">Yuichi Shoda</a> and delved into MEDLINE, the online repository that contains records of almost every published peer-reviewed neuroscience article. We used the <a href="https://genderize.io">Genderize.io</a> database to predict the gender of first and last authors on over 166,000 articles published between 2005 and 2017 in high-profile journals covering neuroscience, our own scientific discipline. The results were dispiriting.</p>
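The core of such a tally is simple to sketch. In the snippet below, a tiny name-to-gender lookup stands in for the Genderize.io database, and the article records are invented for illustration:

```python
# Sketch of an author-gender tally. The lookup table is a stand-in for
# the Genderize.io database; the article records are invented.
NAME_GENDER = {"alice": "female", "carol": "female", "bob": "male", "dave": "male"}

def first_name_gender(full_name):
    """Predict gender from the author's first name; None if unknown."""
    first = full_name.split()[0].lower()
    return NAME_GENDER.get(first)

def female_share(articles, position):
    """Fraction of articles whose author at `position` (0 = first author,
    -1 = last author) is predicted female, among classifiable names."""
    genders = [first_name_gender(a["authors"][position]) for a in articles]
    known = [g for g in genders if g is not None]
    return sum(g == "female" for g in known) / len(known)

articles = [
    {"authors": ["Alice Smith", "Bob Jones"]},
    {"authors": ["Dave Brown", "Carol White"]},
    {"authors": ["Bob Lee", "Dave Kim"]},
]
print(female_share(articles, 0))   # share of female first authors
print(female_share(articles, -1))  # share of female last authors
```

A real analysis would also have to cope with initials-only bylines and names the database cannot classify, which is why unclassifiable names are dropped from the denominator here.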
<h2>Female scientists underrepresented</h2>
<p>We began by looking at first authors – the place in the author list traditionally held by the junior researcher who does the hands-on research. We expected over <a href="https://www.sfn.org/careers-and-training/faculty-and-curriculum-tools/training-program-surveys">40 percent to be women</a>, similar to the percentage of women <a href="https://www.sfn.org/careers-and-training/faculty-and-curriculum-tools/training-program-surveys">postdocs in neuroscience</a> in the U.S. <a href="https://euraxess.ec.europa.eu/worldwide/japan/gender-equality-human-resources-research-and-marie-sklodowska-curie-actions">and Europe</a>. Instead, fewer than 25 percent of first authors in the journals Nature and Science were women.</p>
<p>Our findings were similar for last authors, the place typically held by the laboratory leader. We expected the numbers to match large National Institutes of Health grants, which are a similarly rigorous measure of significance, scientific sophistication and productivity; <a href="https://nexus.od.nih.gov/all/2014/08/08/women-in-biomedical-research/">30 percent are awarded to women</a> – comparable to the proportion of <a href="http://www.sfn.org/nqmfp">women tenure-track faculty in neuroscience</a>. The proportion of women last authors was half what we expected – just over 15 percent of last authors in Science and Nature were women. </p>
<p><iframe id="E6h69" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/E6h69/8/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<p><a href="https://doi.org/10.1101/275362">Our study, published online</a> and highlighted in a <a href="https://www.nature.com/articles/d41586-018-02833-1">letter printed in the journal Nature</a>, focused on neuroscience. We made <a href="https://goo.gl/x4s1iE">our code accessible</a>, and we’re thrilled that students in other fields are already beginning to examine the gender breakdown of bylines in their own disciplines. </p>
<p>One thing our data mining study doesn’t reveal is why women are so seriously underrepresented. But a large literature suggests that gender bias almost certainly plays a role. </p>
<h2>Bias in the publishing pipeline</h2>
<p>One place bias occurs is when scientists themselves undervalue the scientific contributions of women. One analysis found that <a href="https://doi.org/10.1097/ACM.0000000000001261">women are more likely to be the person performing experiments</a>. Despite this, they are more likely to be <a href="https://doi.org/10.1371/journal.pone.0066212">in the less prestigious “middle” author position</a>. Anecdotally, many laboratory leaders have observed that male students tend to be more proactive about negotiating their position in the author list than women.</p>
<p>Bias can also influence the reviewing process. Researchers at the Ohio State University found that, when reviewers are randomly assigned to evaluate scientific work ostensibly submitted by a female or a male author, they <a href="https://doi.org/10.1177/1075547012472684">rated the work written by male authors as having higher rigor</a>. An analysis of peer-review scores for postdoctoral fellowship applications in Sweden revealed a system that was “<a href="https://doi.org/10.1038/387341a0">riddled with prejudice</a>” – women were given lower competence ratings than men who had less than half their publication impact. Bias may be particularly strong when expectations are high – <a href="https://doi.org/10.1371/journal.pone.0150194">qualities like “brilliance” are far more likely to be attributed to men</a>. This may be why we found the proportion of women authors was negatively correlated with journal “impact factor.”</p>
<p>Finally, bias occurs within the editorial process.
Nature, in a series of editorials spanning more than a decade, has observed that its editors are <a href="https://doi.org/10.1038/4381078c">less likely to ask</a> <a href="https://doi.org/10.1038/491495a">women to write</a> <a href="https://doi.org/10.1038/541435b">commissioned pieces</a>.</p>
<p>Do women fail to “lean in”? Female authors may be <a href="https://doi.org/10.1038/s41593-017-0052-6">less likely to submit</a> to high-profile journals. Success rates for elite journals are low – for instance, in Nature, less than <a href="https://www.nature.com/nature/for-authors/editorial-criteria-and-processes">10 percent of submissions make it into print</a>. In many fields, the publication delay associated with a failed submission means there’s a high risk of being scooped by another research team. If a female scientist estimates her chance of success more conservatively than a man, for whatever reason, she will be more likely to play it safe.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/209392/original/file-20180307-146697-llpfdv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/209392/original/file-20180307-146697-llpfdv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/209392/original/file-20180307-146697-llpfdv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/209392/original/file-20180307-146697-llpfdv.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/209392/original/file-20180307-146697-llpfdv.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/209392/original/file-20180307-146697-llpfdv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/209392/original/file-20180307-146697-llpfdv.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/209392/original/file-20180307-146697-llpfdv.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Author lists in journals should reflect who is doing science today and not the ‘old, white men’ of yore.</span>
<span class="attribution"><a class="source" href="https://unsplash.com/photos/zeH-ljawHtg">Giammarco Boscaro on Unsplash</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<h2>Holding journals accountable</h2>
<p>Scientific publishing is staggeringly profitable: In 2017, Elsevier reported <a href="https://www.timeshighereducation.com/news/elseviers-profits-swell-more-ps900-million">profits of over US$1.2 billion</a>. These companies rely heavily on the scientific community, both as authors of the journal content they are selling and as reviewers. Given the profit they make and the outsized influence they wield over scientific careers, it seems obvious that journals have a moral and perhaps even a legal responsibility to make sure the process is equitable.</p>
<p>We believe journals need to take full responsibility for ensuring social equity across the publishing pipeline: encouraging women to submit, ensuring that women receive fair reviews, and enforcing equity in the editorial process.</p>
<p>There are some obvious first steps. The scientific community should demand that journals collect data about gender and ethnicity for article submissions and acceptances, and these data should be publicly available. That way researchers can choose to avoid (or even boycott) journals with a poor track record. Researchers should insist that reviewers be given more specific review criteria – such as requirements to explain their ratings of significance and impact, as well as their assessment of scientific quality, as is done at the <a href="https://grants.nih.gov/grants/peer/critiques/rpg.htm">NIH</a> and the <a href="https://www.nsf.gov/pubs/policydocs/pappg17_1/pappg_3.jsp#IIIA">National Science Foundation</a>. Finally, it is past time for journals to adopt mandatory double-blind reviewing.</p>
<p>While the representation of women authors may not have changed over the last decade or so, the attitude of the scientific community has transformed. When I (Ione Fine) was an undergraduate at Oxford, I was told casually by a professor that “women don’t run with the ball intellectually” – even though I was interviewing him for a feminist magazine! (For 20 years, I have wondered whether this reflected extraordinary arrogance combined with a singular lack of tact or sheer idiocy.) But the only thing that made the comment surprising was the context – his attitude was commonplace.</p>
<p>These days there is an overwhelming consensus in our scientific community that scientific talent is not gendered. <a href="https://www.ecu.ac.uk/equality-charters/athena-swan/athena-swan-members/">Universities</a>, <a href="https://www.nsf.gov/pubs/2010/nsf10593/nsf10593.htm">funding agencies</a>, <a href="http://www.sfn.org/nqmfp">conference organizers</a> and individual laboratory leaders around the world are all working to resolve this problem. It is time for the journals to “lean in.”</p><img src="https://counter.theconversation.com/content/92999/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Women are underrepresented in academic science. New research finds the problem is even worse in terms of who authors high-profile journal articles – bad news for women’s career advancement.Ione Fine, Professor of Psychology, University of WashingtonAlicia Shen, Ph.D Candidate in Psychology, University of WashingtonLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/840322018-01-03T11:20:41Z2018-01-03T11:20:41ZNovelty in science – real necessity or distracting obsession?<figure><img src="https://images.theconversation.com/files/199939/original/file-20171219-5004-1ecssnn.jpg?ixlib=rb-1.1.0&rect=693%2C5%2C2809%2C1943&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">It may take time for a tiny step forward to show its worth.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/man-grey-suit-hold-light-left-541269598">ellissharp/Shutterstock.com</a></span></figcaption></figure><p>In a <a href="https://www.nature.com/news/1-500-scientists-lift-the-lid-on-reproducibility-1.19970">survey of over 1,500 scientists</a>, more than 70 percent of them reported having been unable to reproduce other scientists’ findings at least once. Roughly half of the surveyed scientists ran into problems trying to reproduce their own results. 
No wonder people are talking about a “<a href="https://theconversation.com/us/topics/reproducibility-5484">reproducibility crisis</a>” in scientific research – an epidemic of studies that <a href="https://thenextregeneration.wordpress.com/2013/07/23/replicability-of-high-impact-papers-in-stem-cell-research/">don’t hold up</a> when <a href="https://thenextregeneration.wordpress.com/2013/10/26/the-replicability-crisis-in-cancer-research/">run a second time</a>.</p>
<p>Reproducibility of findings is a core foundation of science. If scientific results only hold true in some labs but not in others, then how can researchers feel confident about their discoveries? How can society put evidence-based policies into place if the evidence is unreliable?</p>
<p>Recognition of this “crisis” has prompted calls for reform. Researchers are feeling their way, experimenting with different practices meant to help distinguish solid science from irreproducible results. Some people are even starting to reevaluate how choices are made about what research actually gets tackled. Breaking innovative new ground is flashier than revisiting already published research. Does prioritizing novelty push science toward irreproducible results?</p>
<h2>Incentivizing the wrong thing?</h2>
<p>One solution to the reproducibility crisis could be simply to conduct lots of replication studies. For instance, the <a href="https://elifesciences.org/collections/9b1e83d1/reproducibility-project-cancer-biology">scientific journal eLife</a> is participating in an initiative to validate and reproduce important recent findings in the field of cancer research. The first set of these “rerun” studies was recently released and <a href="http://www.nature.com/news/cancer-reproducibility-project-releases-first-results-1.21304">yielded mixed results</a>. Two of the five rerun studies were reproducible, one was not, and the remaining two did not provide definitive answers.</p>
<p>There’s no need to restrict this sort of rerun study to cancer research – reproducibility issues can be spotted across <a href="https://theconversation.com/we-found-only-one-third-of-published-psychology-research-is-reliable-now-what-46596">various fields</a> <a href="https://theconversation.com/half-of-biomedical-research-studies-dont-stand-up-to-scrutiny-and-what-we-need-to-do-about-that-45149">of scientific research</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/199940/original/file-20171219-4995-ddcteg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/199940/original/file-20171219-4995-ddcteg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199940/original/file-20171219-4995-ddcteg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199940/original/file-20171219-4995-ddcteg.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199940/original/file-20171219-4995-ddcteg.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199940/original/file-20171219-4995-ddcteg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199940/original/file-20171219-4995-ddcteg.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199940/original/file-20171219-4995-ddcteg.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Researchers should be rewarded for carefully shoring up the foundations of the field.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/scientist-working-laboratory-38872966">Alexander Raths/Shutterstock.com</a></span>
</figcaption>
</figure>
<p>But there’s at least one major obstacle to investing time and effort in this endeavor: the quest for novelty. The <a href="https://doi.org/10.3389/fnhum.2013.00291">prestige of an academic journal</a> depends at least partly on how often the research articles it publishes are cited. Thus, research journals often want to publish novel scientific findings which are more likely to be cited, not necessarily the results of newly rerun older research.</p>
<p>A <a href="https://doi.org/10.1016/j.jclinepi.2012.06.009">study of clinical trials published in medical journals</a> found the most prestigious journals prefer publishing studies considered highly novel and not necessarily those that have the most solid numbers backing up the claims. Funding agencies such as the National Institutes of Health ask scientists who review research grant applications to provide an “innovation” score in order to <a href="https://grants.nih.gov/grants/peer/critiques/rpg_D.htm">prioritize funding for the most innovative work</a>. And scientists of course notice these tendencies – one study found the use of positive words like “novel,” “amazing,” “innovative” and “unprecedented” in paper abstracts and titles <a href="https://doi.org/10.1038/nature.2015.19024">increased almost ninefold between 1974 and 2014</a>.</p>
<p>Genetics researcher <a href="http://dbbs.wustl.edu/faculty/Pages/faculty_bio.aspx?SID=5137">Barak Cohen</a> at Washington University in St. Louis <a href="https://doi.org/10.7554/eLife.28699">recently published a commentary</a> analyzing this growing push for novelty. He suggests that progress in science depends on a delicate balance between novelty and checking the work of other scientists. When rewards such as funding of grants or publication in prestigious journals emphasize novelty at the expense of testing previously published results, science risks developing cracks in its foundation.</p>
<h2>Houses of brick, mansions of straw</h2>
<p>Cancer researcher William Kaelin Jr., a recipient of the <a href="http://www.laskerfoundation.org/awards/show/oxygen-sensing-essential-process-survival/">2016 Albert Lasker Award for Basic Medical Research</a>, <a href="http://www.nature.com/news/publish-houses-of-brick-not-mansions-of-straw-1.22029">recently argued</a> for fewer “mansions of straw” and more “houses of brick” in scientific publications.</p>
<p>One of his main concerns is that scientific papers now inflate their claims in order to emphasize their novelty and the relevance of biomedical research for clinical applications. By exchanging depth of research for breadth of claims, researchers risk compromising the robustness of the work. And by claiming excessive novelty and impact, researchers may undermine the work’s actual significance, because they fail to provide solid evidence for each claim.</p>
<p>Kaelin even suggests that some of his <a href="http://www.pnas.org/content/93/20/10595">own work from the 1990s, which transformed cell biology research</a> by discovering how cells can sense oxygen, may have struggled to get published today.</p>
<p>Prestigious journals often now demand complete scientific stories, from basic molecular mechanisms to proving their relevance in various animal models. Unexplained results or unanswered questions are seen as weaknesses. Instead of publishing one exciting novel finding that is robust, and which could spawn a new direction of research conducted by other groups, researchers now spend years gathering a whole string of findings with broad claims about novelty and impact.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/199942/original/file-20171219-4980-14si8bk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/199942/original/file-20171219-4980-14si8bk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/199942/original/file-20171219-4980-14si8bk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=397&fit=crop&dpr=1 600w, https://images.theconversation.com/files/199942/original/file-20171219-4980-14si8bk.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=397&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/199942/original/file-20171219-4980-14si8bk.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=397&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/199942/original/file-20171219-4980-14si8bk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=499&fit=crop&dpr=1 754w, https://images.theconversation.com/files/199942/original/file-20171219-4980-14si8bk.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=499&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/199942/original/file-20171219-4980-14si8bk.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=499&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">There should be more than one path to a valuable journal publication.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/divergence-paths-forest-crossroads-among-many-681313621">Mehaniq/Shutterstock.com</a></span>
</figcaption>
</figure>
<h2>Balancing fresh findings and robustness</h2>
<p>A challenge for editors and reviewers of scientific manuscripts is judging the novelty and likely long-term impact of the work in front of them. The eventual importance of a new, unique scientific idea is sometimes difficult even for peers grounded in existing knowledge to recognize. Many basic research studies form the basis of future practical applications. One recent study found that of basic research articles that received at least one citation, <a href="https://theconversation.com/tracing-the-links-between-basic-research-and-real-world-applications-82198">80 percent were eventually cited by a patent application</a>. But it takes time for practical significance to come to light.</p>
<p>A collaborative team of economics researchers <a href="https://doi.org/10.1016/j.respol.2017.06.006">recently developed an unusual measure of scientific novelty</a> by carefully studying the references of a paper. They ranked a scientific paper as more novel if it cited a diverse combination of journals. For example, a scientific article citing a botany journal, an economics journal and a physics journal would be considered very novel if no other article had cited this combination of varied references before.</p>
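A minimal sketch of such a combination-based score, assuming papers are processed in publication order. The journal names are invented, and counting brand-new journal pairs is an illustrative simplification of the published measure:

```python
from itertools import combinations

def novelty_scores(papers):
    """Score each paper by how many journal pairs in its reference list
    have never co-occurred in any earlier paper's references.
    `papers` is a publication-ordered list of sets of cited journals."""
    seen_pairs = set()
    scores = []
    for cited in papers:
        pairs = {frozenset(p) for p in combinations(sorted(cited), 2)}
        new_pairs = pairs - seen_pairs
        scores.append(len(new_pairs))
        seen_pairs |= pairs
    return scores

papers = [
    {"Botany J", "Econ J"},               # first paper: one new pairing
    {"Botany J", "Econ J", "Physics J"},  # reuses one pair, adds two
    {"Botany J", "Econ J"},               # nothing unprecedented
]
print(novelty_scores(papers))  # [1, 2, 0]
```

The study's actual indicator also accounts for how rare each pairing is; the point here is only the mechanism of flagging unprecedented reference combinations.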
<p>This measure of novelty allowed them to identify papers which were more likely to be cited in the long run. But it took roughly four years for these novel papers to start showing their greater impact. One may disagree with this particular indicator of novelty, but the study makes an important point: It takes time to recognize the full impact of novel findings. </p>
<p>Realizing how difficult it is to assess novelty should give funding agencies, journal editors and scientists pause. Progress in science depends on new discoveries and following unexplored paths – but solid, reproducible research requires an equal emphasis on the robustness of the work. By restoring the balance between demands and rewards for novelty and robustness, science will achieve even greater progress.</p><img src="https://counter.theconversation.com/content/84032/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Jalees Rehman receives funding from the National Institutes of Health (NIH). </span></em></p>Scientists are rewarded with funding and publications when they come up with innovative findings. But in the midst of a ‘reproducibility crisis,’ being new isn’t the only thing to value about research.Jalees Rehman, Professor of Medicine, Pharmacology and Bioengineering, University of Illinois ChicagoLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/883922017-12-12T03:59:42Z2017-12-12T03:59:42ZUniversities spend millions on accessing results of publicly funded research<figure><img src="https://images.theconversation.com/files/198056/original/file-20171207-31552-nb47pq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Research findings are published in peer-reviewed academic journals, many of which charge universities subscription fees. </span> <span class="attribution"><span class="source">from www.shutterstock.com</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span></figcaption></figure><p>University research is generally funded from the public purse. The results, however, are published in peer-reviewed academic journals, many of which charge subscription fees. </p>
<p>I had to use freedom of information laws to determine how much universities in New Zealand spend on journal subscriptions to give researchers and students access to the latest research. I found they paid almost US$15 million last year to just four publishers.</p>
<p>There are additional costs, too. Paywalls on research hold up scientific progress and limit the public’s access to the latest information.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/not-just-available-but-also-useful-we-must-keep-pushing-to-improve-open-access-to-research-86058">Not just available, but also useful: we must keep pushing to improve open access to research</a>
</strong>
</em>
</p>
<hr>
<p>The project took more than three years because universities originally refused to release the information. I had to make a complaint to the Ombudsman, the government official charged with determining whether information from the state sector should be publicly disclosed. </p>
<p>The Ombudsman’s <a href="https://doi.org/10.6084/m9.figshare.5673583">final opinion</a> ruled unambiguously that the public’s right to know outweighs any commercial interests of the publishers and universities. </p>
<h2>The cost of knowledge</h2>
<p>The following points stand out in a preliminary analysis of spending by New Zealand universities on subscriptions to journals from Elsevier, Springer, Wiley and Taylor & Francis between 2013 and 2016. </p>
<ul>
<li><p>The total amount spent on the four publishers is substantial, around US$14.9 million in 2016 (the total spending on all publishers is likely at least 2-3 times that). The University of Auckland, with 33,000 students and 2,200 academic and research staff, spent US$3.8 million, including US$1.6 million on Elsevier.</p></li>
<li><p>The mean expenditure per academic/research staff member in 2016 was around US$1800.</p></li>
<li><p>The University of Canterbury is getting a much worse deal than the others, paying around 35% above the mean.</p></li>
<li><p>The 17% rise in subscription costs over the period clearly exceeds Consumer Price Index inflation for the same years (2-3% in New Zealand, the USA and Europe).</p></li>
<li><p>The publisher with the highest percentage increase over the period was Taylor & Francis (33%).</p></li>
</ul>
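The per-staff figure and the inflation comparison above follow from simple arithmetic; a sketch using the article's round numbers (US$14.9 million in 2016, roughly 8,400 academic/research staff, a 17% rise over 2013-2016):

```python
total_2016 = 14_900_000   # spend on the four publishers in 2016 (USD)
staff = 8_400             # academic/research staff across the seven universities

# Mean expenditure per academic/research staff member.
mean_per_staff = total_2016 / staff
print(round(mean_per_staff))  # 1774 -- roughly the US$1,800 quoted above

# The 17% rise over 2013-2016 implies a 2013 spend of about US$12.7 million,
# while CPI inflation over the same years was only 2-3%.
growth = 0.17
implied_2013 = total_2016 / (1 + growth)
print(round(implied_2013))  # 12735043
```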
<iframe src="https://datawrapper.dwcdn.net/JQEnj/1/" scrolling="no" frameborder="0" allowtransparency="true" allowfullscreen="allowfullscreen" webkitallowfullscreen="webkitallowfullscreen" mozallowfullscreen="mozallowfullscreen" oallowfullscreen="oallowfullscreen" msallowfullscreen="msallowfullscreen" width="100%" height="476"></iframe>
<h2>Obtaining the information</h2>
<p>Many journal subscription prices are high (for example, the prominent biology journal Cell is over US$2000 per year), especially given that the funding for the research typically comes from public sources. </p>
<p>With the advent of the internet, many people predicted a major drop in expenditure on journals, but the opposite <a href="http://lj.libraryjournal.com/2017/04/publishing/new-world-same-model-periodicals-price-survey-2017/#_">has occurred</a>. One reason is that the main commercial publishers use anti-competitive practices such as bundling of unrelated journals (so-called “<a href="http://econ.ucsb.edu/%7Etedb/Journals/BundleContracts.html">Big Deals</a>”) and <a href="https://www.youtube.com/watch?v=4JsNT1gKe7I">confidentiality clauses in contracts</a>. </p>
<p>Price secrecy allows sellers to use differential pricing and weaken the negotiating situation of buyers, leading to market inefficiency. The fact that <a href="https://en.wikipedia.org/wiki/Ingelfinger_rule">each journal has a monopoly</a> on its specific content means that journals cannot be easily substituted by others.</p>
<p>In 2014, Timothy Gowers and others used freedom of information laws to <a href="https://gowers.wordpress.com/2014/04/24/elsevier-journals-some-facts/">extract the relevant price information from universities in the UK</a>. In 2009, less extensive <a href="http://www.pnas.org/content/111/26/9425.abstract">work in the USA</a> had also been done by Ted Bergstrom and colleagues. Data from <a href="https://avointiede.fi/web/openscience/publisher_costs">Finland</a> and <a href="http://www.vsnu.nl/en_GB/cost-of-publication">Netherlands</a> has recently been made public. </p>
<p>I requested data from seven of New Zealand’s eight universities, which collectively have around 8,400 academic/research staff and 130,000 students. The process was long and required persistence. Following the Ombudsman’s ruling, the universities complied, supplying me with data on spending on journals from Elsevier, Springer, Wiley and Taylor & Francis. </p>
<iframe src="https://datawrapper.dwcdn.net/Wd6x2/4/" scrolling="no" frameborder="0" allowtransparency="true" allowfullscreen="allowfullscreen" webkitallowfullscreen="webkitallowfullscreen" mozallowfullscreen="mozallowfullscreen" oallowfullscreen="oallowfullscreen" msallowfullscreen="msallowfullscreen" width="100%" height="573"></iframe>
<p>There are some subtleties, such as assumptions about exchange rate conversions and exactly which products from the listed publishers the money is spent on. Interested readers can consult the <a href="https://doi.org/10.6084/m9.figshare.5656054">raw data</a>.</p>
<h2>Is open access the answer?</h2>
<p>The restricted access inherent in the subscription model makes it hard for journalists, politicians and the general public to use scholarship for better evidence-based decision making.</p>
<p>Recently, open access journals have emerged. They place no barriers on readers but still have production costs. The <a href="https://en.wikipedia.org/wiki/Open_access#Journals:_gold_open_access">“Gold Open Access” model</a>, in which authors or funders typically pay for each article, has become popular with traditional publishers. They often set the article processing charge level at around US$2000 to US$3000. </p>
<p>The analysis above implies that wholesale conversion to such article processing charges will not save money for universities. <a href="http://bjoern.brembs.net/2017/11/is-a-cost-neutral-transition-to-open-access-realistic/">Several independent estimates</a> put a reasonable article processing charge at no more than US$500 (less in some disciplines).</p>
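A back-of-envelope check of that gap, using only the APC figures quoted above: paying a US$500 processing charge instead of a publisher-level one cuts per-article spending by 75% or more.

```python
publisher_apcs = (2000, 3000)  # APC range typically set by traditional publishers (USD)
fair_apc = 500                 # upper bound of the independent cost estimates (USD)

# Fractional saving per article if the fair charge were paid instead.
savings = {apc: 1 - fair_apc / apc for apc in publisher_apcs}
for apc, saved in savings.items():
    print(f"APC ${apc}: paying ${fair_apc} instead saves {saved:.0%} per article")
```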
<p>The key problem is not the particular model of payment for journal article production and distribution, but the dysfunctional market in publishing services. Although they are a large part of the problem, commercial publishers are not entirely to blame. For example, the research community uses historical journal reputation to evaluate researchers, making it harder for new, better run journals to enter the market. </p>
<h2>The right kind of open access</h2>
<p>Even with the best will in the world, there is an inevitable time lag for new journals to become established. To make faster progress, it is necessary to decouple the ownership of current journal titles from the provision of editorial and publication services, so that competition among publishers helps to control prices. This reclaiming of community control is the most fundamental of the recently formalised <a href="http://fairoa.org">Fair Open Access Principles</a>. </p>
<p>New organisations such as <a href="http://mathoa.org/">MathOA</a>, <a href="http://psyoa.org">PsyOA</a>, <a href="http://lingoa.eu">LingOA</a> and the <a href="http://fairoa.org">Fair Open Access Alliance</a> have been set up to facilitate large-scale conversion of subscription journals to an open access model, with community control of journals and no direct author payments. This of course involves mass defections by editorial boards.</p>
<p>We expect that global savings of at least 75% of current payments to journals can be made by using modern publishing providers such as <a href="https://scholasticahq.com/">Scholastica</a> and <a href="https://www.ubiquitypress.com/">Ubiquity</a>, and by reallocating subscription payments toward article processing charges. What is the research community waiting for?</p>
<p class="fine-print"><em><span>Mark C. Wilson is a board member for MathOA, and its delegate to the Fair Open Access Alliance. Both of these are nonprofit organizations registered in the Netherlands.</span></em></p>Universities in New Zealand spent close to US$15 million on subscriptions to just four publishers in 2016, data that was only released following a request to the Ombudsman.Mark C. Wilson, Senior Lecturer, Department of Computer Science, University of Auckland, Waipapa Taumata RauLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/808692017-11-06T01:22:00Z2017-11-06T01:22:00ZAcademic journal publishing is headed for a day of reckoning<figure><img src="https://images.theconversation.com/files/193246/original/file-20171103-1041-1mv0m1i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Locking articles away behind a paywall stifles access.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/brixton/318141026">Elizabeth</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span></figcaption></figure><p>Imagine a researcher working under deadline on a funding proposal for a new project. This is the day she’s dedicated to literature review – pulling examples from existing research in published journals to provide evidence for her great idea. Creating an up-to-date picture of where things stand in this narrow corner of her field involves 30 references, but she has access to only 27 of those via her library’s journal subscriptions. Now what?</p>
<p>There isn’t time to contact the three primary authors to get copies directly from them. Interlibrary loan will take too long. She could try other sites that host academic papers – such as ResearchGate and Sci-Hub – but access to particular articles isn’t assured and publishers are cracking down on what they call copyright violations.</p>
<p>This fictitious example illustrates the quandary in which many researchers find themselves today. Access to journals is crucial for how they do their work. But few research libraries can afford all the journal subscriptions needed by all of their faculty for all occasions. As the dean of libraries at a state school, I contend that the economic model for academic journal publications is broken. As scholars are handicapped by limited access to the corpus of research in their fields, scientific progress is restricted and slows, and society ultimately loses.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/193247/original/file-20171103-1032-tt9l0h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/193247/original/file-20171103-1032-tt9l0h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/193247/original/file-20171103-1032-tt9l0h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/193247/original/file-20171103-1032-tt9l0h.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/193247/original/file-20171103-1032-tt9l0h.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/193247/original/file-20171103-1032-tt9l0h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/193247/original/file-20171103-1032-tt9l0h.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/193247/original/file-20171103-1032-tt9l0h.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A literature review depends on access to online or hard copies of published research articles.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/wolflawlibrary/3923735590">The Wolf Law Library</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span>
</figcaption>
</figure>
<h2>Path to publication</h2>
<p>Before there’s anything to be published in a journal, a researcher must conduct the study or experiment. The first step is the proposal to secure necessary funding. As in our vignette above, this requires establishing the state of the art from a literature review of professional publications in the discipline – accessed mostly through subscription contracts entered into by their institution’s library.</p>
<p>Volunteer committees of researchers assembled by federal funding agencies, such as the National Science Foundation or National Institutes of Health, review these proposals. Grants are awarded to the ones considered most promising. Then the funded researchers get down to work.</p>
<p>After conducting the research, sometimes over multiple years, the next step for a scholar is to prepare and submit a manuscript to a journal. This publication of new findings becomes the “coin of the realm” in academia. Publishing in top-tier journals provides broad exposure for the ideas, and paves the way to tenure and promotion.</p>
<p>Journal editors, many of whom are themselves researchers and uncompensated for these duties, manage the review and acceptance process. Editors distribute manuscripts to anonymous expert reviewers, typically researchers in the field, who assess whether the work is solid and advances the state of knowledge. The reviewers can recommend that the journal accept the manuscript as is (which almost never happens), accept it with mandatory or suggested revisions, or reject it.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/193235/original/file-20171103-1046-18lg81i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/193235/original/file-20171103-1046-18lg81i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/193235/original/file-20171103-1046-18lg81i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=503&fit=crop&dpr=1 600w, https://images.theconversation.com/files/193235/original/file-20171103-1046-18lg81i.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=503&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/193235/original/file-20171103-1046-18lg81i.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=503&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/193235/original/file-20171103-1046-18lg81i.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=632&fit=crop&dpr=1 754w, https://images.theconversation.com/files/193235/original/file-20171103-1046-18lg81i.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=632&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/193235/original/file-20171103-1046-18lg81i.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=632&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Example of an article hidden behind an online paywall, accessible only to subscribers.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/dullhunk/5471810850">Duncan Hull</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>Now it’s up to the original researcher to make changes and submit the final article for publication. Often authors must pay the publisher processing charges, typically in the range of US$2,000 to $5,000 per article. For this fee, the journals provide copy editing and other publishing functions, and produce galley proofs for the authors to review and vet.</p>
<p>Finally, the article is published, where it’s most likely hidden behind a “paywall” on a site that can be accessed only by paying subscribers, typically academic institutions’ libraries.</p>
<p>Note that most of the heavy lifting is accomplished by the researchers themselves. </p>
<h2>Ballooning costs make it impossible to keep up</h2>
<p>So, in our institutions of higher education and our research labs, scholars first produce, then buy back, their own content.</p>
<p>For this privilege, thousands of institutions pay billions of dollars per year to publishers. Members of the <a href="http://www.arl.org/">Association of Research Libraries</a> alone report spending about $1 billion per year <a href="http://www.arl.org/focus-areas/statistics-assessment/statistical-trends">purchasing subscriptions to journals</a>. </p>
<p>According to an internal Association of Research Libraries survey conducted in 2016, the vast majority of its member libraries can’t keep up with these costs. Their budgets aren’t growing as fast as publisher prices, so they must make hard decisions about which journal subscriptions to let lapse, which new and emerging areas to ignore, and where to cut their nonjournal collections to free up dollars for journal cost increases.</p>
<p>And the price tags are rising, with <a href="http://www.arl.org/focus-areas/statistics-assessment/statistical-trends">journal inflation costs outpacing the consumer price index</a> by a factor of four to five.</p>
<p><iframe id="qpFCw" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/qpFCw/2/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>The 123 ARL institutions are among the best, most well-funded institutions in North America. Things may be far worse for other institutions and their libraries. And the situation may be most extreme for institutions and researchers in developing countries, who can afford few if any paywall subscriptions to journals – at a time when an increasing amount of research is conducted collaboratively across borders and is relevant to their countries.</p>
<p>Academia’s original impetus for publishing through the private sector was to ensure a sufficient economic base to perform the copy editing and publication. Now, some of the larger publishers report <a href="https://en.wikipedia.org/wiki/Elsevier">billions of dollars of profit annually</a>, exceeding 30 percent of revenue – never the intent.</p>
<p>For their part, the journals say they’re providing <a href="http://science.sciencemag.org/content/352/6285/497.full">valuable services that have real costs</a> – things such as expertly curating, editing and proofreading the content. But critics claim publishers are more interested in profit than disseminating scholarship. It doesn’t seem they’re passing on any cost savings that have presumably resulted from tech advances – things such as accepting electronic copies of papers that make it easier to produce final versions, or doing away with printing expensive hard copies and exclusively publishing online. </p>
<p>This upside-down publishing picture has persisted since the 1980s. Due to institutions’ reliance upon library materials, the inability to keep up with such extreme cost increases is damaging to higher education instruction and research. Without consistent access to the cutting-edge knowledge that’s embodied in the universe of journal publications, faculty, students and researchers can’t keep up with new research.</p>
<h2>Short-circuiting the system</h2>
<p>Maybe it’s not a surprise that individuals find ways to circumvent the publishers’ paywalls to access content, despite the scrupulous adherence to copyright law <a href="http://www.arl.org/focus-areas/copyright-ip">espoused by libraries and librarians</a>.</p>
<p>Access via ResearchGate – a networking site where researchers can share papers – doesn’t reliably provide journal articles’ full text. And it’s dealing with the threat of <a href="http://www.nature.com/news/publishers-threaten-to-remove-millions-of-papers-from-researchgate-1.22793">lawsuits from publishers</a> who say the site violates their copyrights.</p>
<p>Sci-Hub provides access to tens of millions of papers, letting people sneak around paywalls. Again, the publishers claim copyright violation. The site, hosted in Russia, has faced injunctions and lawsuits, and even been ordered by a New York court to pay publisher Elsevier <a href="https://www.nature.com/news/us-court-grants-elsevier-millions-in-damages-from-sci-hub-1.22196">$15 million in damages</a>.</p>
<p>But efforts like these to bypass paywalls are only symptoms of the problem. The source is the unsustainable economic model embedded in current journal publishing. If we are to maintain healthy education and research environments, change is imperative. </p>
<p>Possible solutions include taking collective action with publishers to obtain lower pricing immediately, with reasonable annual inflation, and better bundling of titles so libraries get the ones they want rather than the ones publishers add. Open access journals that don’t employ paywalls could help. Another partial solution could be capturing many more preprint versions of journal articles in a formal, citable fashion. The state of academic publishing is in such crisis that a variety of strategies may need to be adopted.</p>
<p class="fine-print"><em><span>Patrick Burns does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>In our institutions of higher education and our research labs, scholars first produce, then buy back, their own content. With the costs rising and access restricted, something’s got to give.Patrick Burns, Dean of Libraries and Vice President for Information Technology, Colorado State UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/768512017-05-30T01:49:32Z2017-05-30T01:49:32ZResearch transparency: 5 questions about open science answered<figure><img src="https://images.theconversation.com/files/171204/original/file-20170526-6389-1eepgnq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Opening up data and materials helps with research transparency.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/book-wisdom-life-read-magic-background-515241850">REDPIXEL.PL via Shutterstock.com</a></span></figcaption></figure><p><strong>What is “open science”?</strong></p>
<p><a href="https://osf.io/preprints/psyarxiv/ak6jr">Open science</a> is a set of practices designed to make scientific processes and results more transparent and accessible to people outside the research team. It includes making complete research materials, data and lab procedures freely available online to anyone. Many scientists are also proponents of <a href="https://sparcopen.org/open-access/">open access</a>, a parallel movement involving making research articles available to read without a subscription or access fee.</p>
<p><strong>Why are researchers interested in open science? What problems does it aim to address?</strong></p>
<p>Recent research finds that many published scientific findings might not be reliable. For example, researchers have reported being able to replicate <a href="https://elife.elifesciences.org/collections/reproducibility-project-cancer-biology">only 40 percent</a> <a href="https://doi.org/10.1038/nrd3439-c1">or less</a> of <a href="http://www.nature.com/nature/journal/v483/n7391/full/483531a.html">cancer biology results</a>, and a large-scale <a href="https://doi.org/10.1126/science.aac4716">attempt to replicate 100 recent psychology studies</a> successfully reproduced fewer than half of the original results.</p>
<p>This has come to be called a “<a href="https://theconversation.com/we-found-only-one-third-of-published-psychology-research-is-reliable-now-what-46596">reproducibility crisis</a>.” It’s pushed many scientists to look for ways to improve their research practices and increase study reliability. Practicing open science is one way to do so. When scientists share their underlying materials and data, other scientists can more easily evaluate and attempt to replicate them.</p>
<p>Also, open science can help speed scientific discovery. When scientists share their materials and data, others can use and analyze them in new ways, potentially leading to new discoveries. Some journals are specifically dedicated to publishing data sets for reuse (<a href="https://www.nature.com/sdata/">Scientific Data</a>; <a href="http://openpsychologydata.metajnl.com/">Journal of Open Psychology Data</a>). <a href="http://doi.org/10.5334/jopd.ac">A paper in the latter</a> has already been cited 17 times in under three years – nearly all these citations represent new discoveries, sometimes on topics unrelated to the original research.</p>
<p><strong>Wait – open science sounds just like the way I learned in school that science works. How can this be new?</strong></p>
<p>Under the status quo, science is shared through a single vehicle: Researchers publish journal articles summarizing their studies’ methods and results. The key word here is summary; to write a clear and succinct article, important details may be omitted. Journal articles are vetted via the peer review process, in which an editor and a few experts assess them for quality before publication. But – perhaps surprisingly – the primary data and materials underlying the article are almost never reviewed. </p>
<p>Historically, this made some sense because journal pages were limited, and storing and sharing materials and data were difficult. But with computers and the internet, it’s much easier to practice open science. It’s now feasible to store large quantities of information on personal computers, and <a href="https://www.nature.com/sdata/policies/repositories">online repositories to share study materials and data</a> are becoming more common. Recently, some journals have even begun to <a href="http://journals.plos.org/plosone/s/data-availability">require</a> or <a href="https://osf.io/tvyxz/wiki/5.%20Adoptions%20and%20Endorsements/">reward</a> <a href="http://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.1002456">open science practices</a> like publicly posting materials and data.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/171205/original/file-20170526-6402-1kb6dxp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/171205/original/file-20170526-6402-1kb6dxp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/171205/original/file-20170526-6402-1kb6dxp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=397&fit=crop&dpr=1 600w, https://images.theconversation.com/files/171205/original/file-20170526-6402-1kb6dxp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=397&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/171205/original/file-20170526-6402-1kb6dxp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=397&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/171205/original/file-20170526-6402-1kb6dxp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=499&fit=crop&dpr=1 754w, https://images.theconversation.com/files/171205/original/file-20170526-6402-1kb6dxp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=499&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/171205/original/file-20170526-6402-1kb6dxp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=499&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Open science makes sharing data the default.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/client-passing-documentation-binders-his-partner-330663044">Bacho via Shutterstock.com</a></span>
</figcaption>
</figure>
<p>There are still some difficulties sharing extremely large data sets and physical materials (such as the specific liquid solutions a chemist might use), and some scientists might have good reasons to keep some information private (for instance, trade secrets or study participants’ personal information). But as time passes, more and more scientists will likely practice open science. And, in turn, science will improve.</p>
<p>Some do view the open science movement as a return to science’s core values. Most researchers over time have <a href="https://doi.org/10.1525/jer.2007.2.4.3">valued transparency</a> as a key ingredient in evaluating the truth of a claim. Now with technology’s help it is much easier to share everything.</p>
<p><strong>Why isn’t open science the default? What incentives work against open science practices?</strong></p>
<p>Two major forces work against adoption of open science practices: habits and reward structures. First, most established researchers have been practicing closed science for years, even decades, and changing these old habits requires some upfront time and effort. <a href="https://osf.io">Technology</a> is helping speed this process of adopting open habits, but behavioral change is hard. </p>
<p>Second, scientists, like other humans, tend to repeat behaviors that are rewarded and avoid those that are punished. Journal editors have tended to favor publishing papers that tell a tidy story with perfectly clear results. This has led researchers to craft their papers to be free from blemish, omitting “failed” studies that don’t clearly support their theories. But real data are often messy, so being fully transparent can open up researchers to critique. </p>
<p>Additionally, some researchers are afraid of being “scooped” – they worry someone will steal their idea and publish first. Or they fear that others will <a href="http://www.nejm.org/doi/full/10.1056/NEJMe1516564">unfairly benefit</a> from using shared data or materials without putting in as much effort. </p>
<p>Taken together, some researchers worry they will be punished for their openness and are skeptical that the perceived increase in workload that comes with adopting open science habits is needed and worthwhile. We believe scientists must continue to <a href="https://osf.io/tvyxz/">develop systems</a> to <a href="http://www.ourdigitalmags.com/publication/?i=365522&article_id=2657445&view=articleBrowser&ver=html5#%7B%22issue_id%22:365522,%22view%22:%22articleBrowser%22,%22article_id%22:%222657445%22%7D">allay fears</a> and reward openness. </p>
<p><strong>I’m not a scientist; why should I care?</strong></p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/171145/original/file-20170526-6380-6rryx7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/171145/original/file-20170526-6380-6rryx7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/171145/original/file-20170526-6380-6rryx7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=466&fit=crop&dpr=1 600w, https://images.theconversation.com/files/171145/original/file-20170526-6380-6rryx7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=466&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/171145/original/file-20170526-6380-6rryx7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=466&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/171145/original/file-20170526-6380-6rryx7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=585&fit=crop&dpr=1 754w, https://images.theconversation.com/files/171145/original/file-20170526-6380-6rryx7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=585&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/171145/original/file-20170526-6380-6rryx7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=585&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Open access is the cousin to open science – the idea is that research should be freely available to all, not hidden behind paywalls.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/34070876@N08/3602393341">h_pampel</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>Science benefits everyone. If you’re reading this article now on a computer, or have ever benefited from an antibiotic, or kicked a bad habit following a psychologist’s advice, then you are a consumer of science. Open science (and its cousin, open access) means that anyone – including teachers, policymakers, journalists and other nonscientists – can access and evaluate study information.</p>
<p>Considering automatic enrollment in a 401k at work or whether to have that elective screening procedure at the doctor? Want to ensure your tax dollars are spent on policies and programs that actually work? Access to high-quality research evidence matters to you. Open materials and open data facilitate reuse of scientific products, increasing the value of every tax dollar invested. Improving science’s reliability and speed benefits us all.</p>
<p class="fine-print"><em><span>Elizabeth Gilbert supports the Society for the Improvement of Psychological Science and has published on replication efforts as part of the Open Science Collaboration. Along with Katherine Corker and Barbara Spellman, she has a chapter called "Open Science: What, why, how" forthcoming in the Stevens Handbook of Experimental Psychology and Cognitive Neuroscience.</span></em></p><p class="fine-print"><em><span>Katie Corker is on the executive board for the Society for the Improvement of Psychological Science (improvingpsych.org) and an ambassador for the Center for Open Science (cos.io). She is also an editorial board member for Scientific Data. All of these roles are pro bono.</span></em></p>Partly in response to the so-called ‘reproducibility crisis’ in science, researchers are embracing a set of practices that aim to make the whole endeavor more transparent, more reliable – and better.Elizabeth Gilbert, Postdoctoral Research Fellow in Psychiatry and Behavioral Sciences, Medical University of South CarolinaKatie Corker, Assistant Professor of Psychology, Grand Valley State University Licensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/726692017-02-26T16:59:05Z2017-02-26T16:59:05ZThe peer-review system for academic papers is badly in need of repair<figure><img src="https://images.theconversation.com/files/156762/original/image-20170214-25992-15ckbwa.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The scientific refereeing process can be tedious, time-consuming and isn't very rewarding.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Peer review, or scientific refereeing, is the basis of the academic process. It’s a rigorous evaluation that aims to ensure only work which advances knowledge is published in a scientific journal. 
Scientists must be able to trust this system: if they see that something is peer reviewed, it should be a hallmark of quality.</p>
<p>When the editor of a scientific journal receives a manuscript, they ask another scientist – a specialist in the field – to review it. The referee is required to advise the editor on whether the manuscript should be published and to give <a href="https://theconversation.com/how-plugging-into-well-connected-colleagues-can-help-research-fly-71223">feedback</a> to the authors.</p>
<p>The system is not flawless. There have been instances of <a href="http://blogs.lse.ac.uk/impactofsocialsciences/2016/12/13/manipulating-the-peer-review-process-why-it-happens-and-how-it-might-be-prevented/">fraud and manipulation</a> of the refereeing process, but these are – we hope – isolated cases. </p>
<p>But there are much bigger systemic problems associated with peer review, and they are negatively affecting scientific credibility. One is that, globally, it is hard to find referees: reviewing a manuscript demands a lot of time and offers minimal reward. Very few journals pay referees, and most academics who act as referees do so for free in their spare time.</p>
<p>On top of this, those who do act as referees often struggle to deliver on time. Worse still, their reports are not always helpful to editors or authors. </p>
<p>Some journals work actively to tackle these issues, but more can be done to ensure that the scientific refereeing system retains its integrity.</p>
<h2>The challenges</h2>
<p>Journal editors are frustrated about the dearth of referees. In an <a href="https://hub.wiley.com/community/exchanges/discover/blog/2015/01/07/recognition-for-peer-review-and-editing-in-australia-and-beyond">open letter</a> to the scientific community, a group of editors wrote that, despite:</p>
<blockquote>
<p>… so much weight [being] given to peer-reviewed publication the essential “backroom” tasks of editing journals and reviewing articles are rarely acknowledged as aspects of academic performance.</p>
</blockquote>
<p>No wonder they’re worried: more than <a href="http://www.informationr.net/ir/14-1/paper391.html">1 million research articles</a> are published globally each year. That requires a lot of referees. But finding appropriate referees is just one part of the bigger task facing editors.</p>
<p>Editors have to get referees to stick to the agreed deadlines. That’s not easy: people tend not to prioritise their review tasks since time spent on their own research is more rewarding.</p>
<p>An experiment conducted with the Journal of Public Economics, based in Cambridge in the US, found that its referees are late with their reports <a href="http://pubs.aeaweb.org/doi/10.1257/jep.28.3.169">half of the time</a>. There are also instances, across journals, of referees simply never delivering even though they’ve promised to do so.</p>
<p>In some disciplines, these problems have given rise to a serious publication lag – the time between a manuscript’s arrival and its actual publication. Over the past 30 years this lag has nearly <a href="http://www.journals.uchicago.edu/doi/10.1086/341868">tripled</a> in economics, from 11 months to just under 30 months. </p>
<p>The lag not only slows the dissemination of ideas. It also worsens the prospects of <a href="http://voxeu.org/article/publication-lags-and-young-economists-research-output">young scientists</a> who need publications to be hired.</p>
<p>Another problem with the existing system is that referee reports do not always adequately inform the editor nor really suggest ways of fundamentally improving the article.</p>
<p>It’s not just authors who complain about this: <a href="http://www.acrwebsite.org/search/view-conference-proceedings.aspx?Id=8104">journal editors</a> do too. One explanation is that referees may follow their own interests, which are not necessarily those of the editor nor the author.</p>
<p>All too often referees try to impress editors by making blemishes look like flaws. Economists call this problem “<a href="https://academic.oup.com/rfs/article/28/3/637/1577216/Editorial-Cosmetic-Surgery-in-the-Academic-Review">signal jamming</a>”. At worst, it may lead journals to turn down innovative research.</p>
<h2>Possible changes</h2>
<p>The good news is that journals are aware of these problems, and are committed to tackling them.</p>
<p>Journals should develop and nurture a large base of potential referees, constantly adding new ones and retaining old ones. And these referees need proper recognition. This could involve simply thanking referees publicly, or perhaps awarding prizes for good refereeing.</p>
<p>Journals should also consider paying referees. The estimated value of unpaid referee time is as much as <a href="https://www.timeshighereducation.com/news/unpaid-peer-review-is-worth-19bn/402189.article">£1.9 billion a year</a> – it is clearly a service that requires some financial reward.</p>
<p>Small changes help, too. <a href="http://voxeu.org/article/lessons-experiment-referees-journal-public-economics">Shorter deadlines</a> reduce turnaround time, because referees often just submit before the deadline. A public list of referees’ turnaround times also encourages them to stay on schedule.</p>
<p>Editors should also <a href="http://rfssfs.org/files/2015/01/Joint-Editorial-Advice-for-Authors-2002.pdf">reject</a> <a href="https://academic.oup.com/rfs/article/26/11/2685/1613905/Joint-Editorial">articles</a> that are too sloppy, rather than letting a referee improve them.</p>
<p>Editors should also engage in “<a href="https://academic.oup.com/rfs/article/28/3/637/1577216/Editorial-Cosmetic-Surgery-in-the-Academic-Review">active editing</a>”, instructing the author to ignore referee requests that are merely asking them to fix blemishes.</p>
<p>Editors should also <a href="https://academic.oup.com/rfs/article/25/5/1331/1569914/Reviewing-Less-Progressing-More">pare down</a> the demands on referees, perhaps by asking them to <a href="http://pubs.aeaweb.org/doi/10.1257/jep.31.1.231">separate</a> necessities from suggestions. The guiding principle should be that the work is the author’s – not the referee’s.</p>
<h2>New approaches are being tested</h2>
<p>Journals are already testing new approaches. For instance, some require their editors to <a href="http://revfin.org/new-referee-awards-and-referee-database/">judge the quality</a> of a referee to weed out those people who are simply unhelpful. </p>
<p>Elsevier, a major publisher, has launched a <a href="https://www.reviewerrecognition.elsevier.com/">platform</a> which publicly lists referees and how often they have written referee reports. A similar, independent platform is <a href="https://publons.com/home/">Publons</a>.</p>
<p>“Open peer review” is also growing in popularity. Traditionally, reviewers remain anonymous to guarantee an unbiased opinion. Open peer review goes the opposite way: the referee’s name and report are published together with the article. Everyone can see who the referee was, which is meant to encourage transparency. Not everyone is <a href="http://www.nature.com/nature/peerreview/debate/">convinced</a> about this approach.</p>
<p>Another option is post-publication peer review, in which articles are open for comments all the time from anyone. Sadly, <a href="http://blogs.lse.ac.uk/impactofsocialsciences/2014/11/07/controversy-of-post-publication-peer-review/">internet trolls</a> have tainted this process for many scientists.</p>
<p>It is encouraging that the problems of peer review are being debated and that new approaches are being tested. The peer-review process is very important and its challenges must be taken seriously if academics are to keep publishing quality articles that disseminate new ideas.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
There are major systemic problems associated with peer review that are negatively affecting scientific credibility.
Michael E. Rose, PhD Candidate in Economics, University of Cape Town
Willem H. Boshoff, Associate Professor of Economics, Stellenbosch University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/71613 2017-01-20T03:34:59Z
Who will keep predatory science journals at bay now that Jeffrey Beall’s blog is gone?
<figure><img src="https://images.theconversation.com/files/153532/original/image-20170119-26563-1bw4put.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The number of predatory scientific journals has exploded in recent years.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure>
<p>For aficionados of bad science, the <a href="http://web.archive.org/web/20161130225402/https://scholarlyoa.com/">blog</a> of University of Colorado librarian <a href="https://en.wikipedia.org/wiki/Jeffrey_Beall">Jeffrey Beall</a> was essential reading. Beall’s blog charted the murky world of predatory and vanity academic publishers, many of which charge excessive fees for publishing papers or have dysfunctional peer review processes.</p>
<p>I’ve seen rubbish on <a href="http://web.archive.org/web/20161220104931/https://scholarlyoa.com/2016/07/14/more-fringe-science-from-borderline-publisher-frontiers/">chemtrails</a>, <a href="http://web.archive.org/web/20161220062215/https://scholarlyoa.com/2016/02/04/fringe-scientist-named-editor-in-chief-of-omics-astrobiology-journal/">alien life</a>, <a href="http://web.archive.org/web/20161207062935/https://scholarlyoa.com/2013/07/16/recognizing-a-pattern-of-problems-in-pattern-recognition-in-physics/">climate</a>, <a href="http://web.archive.org/web/20161108155658/https://scholarlyoa.com/2014/12/16/the-chinese-publisher-scirp-scientific-research-publishing-a-publishing-empire-built-on-junk-science/">HIV-AIDS</a> and <a href="http://web.archive.org/web/20150905084939/http://scholarlyoa.com/2013/07/20/omics-journal-publishes-pseudo-science-vaccine-paper/">vaccines</a> appear in these (unintended) parodies of academic publications. To be honest, though, they can be a guilty pleasure of sorts. Perhaps I’m like a film buff getting a kick out of Ed Wood’s “<a href="http://www.imdb.com/title/tt0052077/">Plan 9 from Outer Space</a>”.</p>
<p>But recently all of the content on Beall’s blog was <a href="http://www.sciencemag.org/news/2017/01/mystery-controversial-list-predatory-publishers-disappears">wiped without any warning</a>. While much of Beall’s blog is <a href="https://web.archive.org/web/20170112125427/https://scholarlyoa.com/publishers/">archived</a>, it had been charting the evolution of predatory academic publishing, including conferences and the purchasing of existing journals. With Beall’s blog gone, it will become harder to keep track of the underbelly of academic publishing.</p>
<h2>Changing face of scientific publishing</h2>
<p>Traditionally, academic journals have been sustained via subscriptions, particularly those charged to academic libraries. Libraries would pick and choose which journals to subscribe to, in large part based on the requests of academics. </p>
<p>Subscriptions provided some incentive to maintain quality but also limited the readership of academic papers, effectively excluding the broader public (whose taxes often funded the research).</p>
<p>The internet has made sharing information easy, and this is now extending to academic publications too. The “<a href="https://theconversation.com/au/topics/open-access-1060">open access</a>” model is increasingly popular, where authors are charged publication fees and the resulting papers are freely available online.</p>
<p>In principle, I like open access, as I believe science should be disseminated to the broadest audience possible. But there are perverse incentives. Will a publisher reject a manuscript that is manifestly rubbish, and forego the fees it would charge the author? In some cases the answer is “no”.</p>
<p>Furthermore, the shift from printed journals to online publications has facilitated predatory and vanity academic publishers. Computers and websites have replaced printing presses and bound volumes. One publisher on Beall’s list, Zant World Press, is run from a <a href="https://zantworldpress.com/about-us/">Melbourne suburban house</a>.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/153530/original/image-20170119-26585-1a18f10.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/153530/original/image-20170119-26585-1a18f10.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/153530/original/image-20170119-26585-1a18f10.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=498&fit=crop&dpr=1 600w, https://images.theconversation.com/files/153530/original/image-20170119-26585-1a18f10.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=498&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/153530/original/image-20170119-26585-1a18f10.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=498&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/153530/original/image-20170119-26585-1a18f10.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=625&fit=crop&dpr=1 754w, https://images.theconversation.com/files/153530/original/image-20170119-26585-1a18f10.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=625&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/153530/original/image-20170119-26585-1a18f10.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=625&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">An archive of Beall’s site maintains the most recent list of suspect journals.</span>
</figcaption>
</figure>
<p>Beall’s blog charted the explosion of predatory publishers exploiting the open access model. His list grew from just 18 publishers in 2011 to <a href="http://web.archive.org/web/20170111172023/https://scholarlyoa.com/2017/01/03/bealls-list-of-predatory-publishers-2017/">1,155 publishers in 2017</a>!</p>
<p>I, along with many others, found Beall’s list an incredibly useful resource. Suspicious scientific claims could often be traced back to journals associated with publishers on the list.</p>
<p>For example, in 2015 many newspapers printed claims that chocolate helped weight loss, but it <a href="https://theconversation.com/trolling-our-confirmation-bias-one-bite-and-were-easily-sucked-in-42621">was all a hoax</a>, which included publishing a paper in the <a href="http://www.intarchmed.com/">International Archives of Medicine</a>, which was on <a href="https://archive.fo/9MAAD">Beall’s list</a>.</p>
<p>I recently became aware of another prank, played at the expense of a predatory publisher. Astronomer <a href="http://www.isdc.unige.ch/%7Edeckert/newsite/Dominique_Eckerts_Homepage.html">Dominique Eckert</a> submitted the joke paper “Get me off Your Fucking Mailing List” to <a href="http://www.iosrjournals.org/">IOSR journals</a>. The paper consists of “<a href="http://www.scs.stanford.edu/%7Edm/home/papers/remove.pdf">get me off your fucking mailing list</a>” repeated hundreds of times.</p>
<p>While one cannot fault the paper for clarity of expression, it isn’t suitable for an academic journal. But less than a week after Eckert submitted the paper, it was accepted for publication. The “reviewers comments” were “quality of manuscript is good”. Manuscript handling charges were US$75 (A$100).</p>
<p>Remarkably, this isn’t the first time a predatory publisher has accepted “Get me off Your Fucking Mailing List”. <a href="http://www.slate.com/blogs/browbeat/2014/11/24/bogus_academic_journal_accepts_paper_that_reads_get_me_off_your_fucking.html">Peter Vamplew</a> played the same prank in 2014.</p>
<p>Beall planned a post on Eckert’s prank for Thursday January 12, 2017, but it never happened. By then, all the content was wiped from the blog.</p>
<p>Why this happened isn’t yet clear. The University of Colorado says it was Beall’s <a href="http://www.sciencemag.org/news/2017/01/mystery-controversial-list-predatory-publishers-disappears">personal decision</a>. However, <a href="https://www.sspnet.org/careers/professional-profiles/lacey-earle/">Lacey Earle</a>, who has been working with Beall, tweeted that Beall “was forced to shut down blog due to threats and politics”.</p>
<p>Certainly there are many publishers and individuals who are no fans of Beall, and legal threats have been made in the past. Without a doubt, the blog has hurt some publishers’ reputations and bottom lines. </p>
<p>Indeed, Beall’s work certainly facilitated the US Federal Trade Commission charging <a href="https://www.omicsonline.org/">OMICS Group</a> with <a href="https://www.ftc.gov/news-events/press-releases/2016/08/ftc-charges-academic-journal-publisher-omics-group-deceived">deceptive acts or practices</a> in August 2016. <a href="https://www.wired.com/2016/09/ftc-cracking-predatory-science-journals/">OMICS has responded</a> and described the allegations as “baseless”.</p>
<h2>Changing times</h2>
<p>A few years ago, predatory publishing often consisted of websites with stock images and poor grammar. Sometimes journal “editors” were revealed to be <a href="http://web.archive.org/web/20170113050537/https://scholarlyoa.com/2015/07/07/predatory-journal-lists-murdered-doctor-as-its-editor-in-chief/">identities stolen off the web</a>. </p>
<p>But, increasingly, predatory publishers are running academic conferences in countries around the globe, including the <a href="http://web.archive.org/web/20161127023353/http://www.conferenceseries.com/usa-meetings/">US</a> and <a href="http://web.archive.org/web/20170115115454/http://www.conferenceseries.com/australia-meetings">Australia</a>. Often the conferences do not live up to their hype, as Radio National’s Hagar Cohen found when <a href="http://www.abc.net.au/radionational/programs/backgroundbriefing/2015-08-02/6656116">she attended</a> an OMICS conference in Brisbane in 2015.</p>
<p>Predatory publishers are also <a href="http://web.archive.org/web/20170110160651/https://scholarlyoa.com/2016/09/29/scam-publisher-omics-international-buying-legitimate-journals/">buying existing journals</a> in developed countries. Recently the <a href="http://www.amj.net.au/index.php?journal=AMJ">Australasian Medical Journal</a>’s contact details shifted from Melbourne to London, and it now shares the postal address of <a href="https://www.imedpub.com/">iMedPub</a>, an affiliate of OMICS.</p>
<p>Beall had been reporting this changing landscape of predatory publishing, and I suspect this is where the loss of his blog will have the greatest impact. That said, Beall’s archived list will long remain a valuable resource. And perhaps most importantly, he made the community aware of the threat of predatory academic publishing.</p>
<p class="fine-print"><em><span>Michael J. I. Brown receives research funding from the Australian Research Council and Monash University, and has developed space-related titles for Monash University's MWorld educational app.
</span></em></p>
A leading website that monitored predatory open access journals has closed. This will make it harder to keep tabs on this corrosive force within science.
Michael J. I. Brown, Associate professor, Monash University
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/67972 2016-11-15T02:54:59Z
Peer review is in crisis, but should be fixed, not abolished
<figure><img src="https://images.theconversation.com/files/145895/original/image-20161114-5075-12n06hz.jpg?ixlib=rb-1.1.0&rect=0%2C427%2C4469%2C2966&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">More is less in the world of research publications.</span> <span class="attribution"><a class="source" href="http://www.shutterstock.com/pic-308642543/stock-photo-stack-of-papers-and-glasses-lying-on-table-desaturated.html">Desktop image via www.shutterstock.com.</a></span></figcaption></figure>
<p>This year three Nobel Prize-winning biologists broke with tradition and <a href="http://www.nytimes.com/2016/03/16/science/asap-bio-biologists-published-to-the-internet.html?_r=0">published their research directly on the internet</a> as so-called preprints. Their motivation? Saving time.</p>
<p>Traditionally, scientific studies are published in peer-reviewed journals, which require other scientists to evaluate submitted research to determine its soundness for publication. Peer review is supposed to be a good thing, in theory acting as a stopgap for science that isn’t sound, but it’s increasingly <a href="http://www.vox.com/2015/12/7/9865086/peer-review-science-problems">getting a bad rap</a>. Beyond the time it takes to actually get the science done, peer review has become the slowest step in the process of sharing studies. Cycles of peer review-revise-resubmit in biology can span <a href="https://scirev.sc/">months to more than a year</a> for a single manuscript. This situation hampers progress because it delays how long it takes for breakthroughs to become available to other scientists and the public. </p>
<p>How did things get so bad? It’s all about competition, supply and demand. Modern science is done in the context of a <a href="http://doi.org/10.1126/science.1067477">tournament mentality</a>, with a large number of competitors (scientists) vying for a small number of prizes (jobs, tenure, funding). To be competitive, scientists must prove their “worth” through publications, and this pressure has created unanticipated challenges in how scientists report their own work and evaluate that of others – ultimately resulting in unacceptable delays in sharing sound science. </p>
<p>But trying to bypass this traditional route for sharing scientific results is not likely to advance scientific progress. As a journal editor and practicing scientist, I suggest we need to fix the real problem: our standards for publication. Done right, a recalibration would lead to fewer research papers – but that counterintuitive outcome may be exactly what’s needed to more efficiently advance scientific progress. </p>
<h2>More money, more journals, more problems</h2>
<p>Between 1995 and 2003, the U.S. National Institutes of Health’s budget <a href="https://www.nih.gov/about-nih/what-we-do/nih-almanac/appropriations-section-1">increased 2.4-fold</a>. With more research being funded, publishers expanded the number of journals dedicated to biomedical research, and the number of studies published grew <a href="https://www.nlm.nih.gov/bsd/index_stats_comp.html">twofold</a>, creating a <a href="http://doi.org/10.1038/495426a">US$9.4 billion scientific publishing industry</a>.</p>
<p>But while the numbers have all increased in proportion, the quality has not. Scientific journals have a pecking order, and more “prestigious” journals are thought to have <a href="https://www.theguardian.com/commentisfree/2013/dec/09/how-journals-nature-science-cell-damage-science">higher standards for publication</a>. These standards are based on a hazy mix of perceived quality of the work, its potential to significantly influence thinking in the field and the <a href="http://www.ascb.org/on-publishing-and-the-sneetches-a-wake-up-call-november-december-2016-newsletter/">possibly unfounded reputation</a> of the journal itself.</p>
<p>How one ranks journal prestige is the subject of heated debate, but <a href="https://www.theguardian.com/commentisfree/2013/dec/09/how-journals-nature-science-cell-damage-science">one flawed and pervasive metric</a> is the <a href="http://wokinfo.com/essays/impact-factor/">impact factor</a>. The impact factor of a journal reflects the number of times publications in that journal are cited by other scientific publications. It’s often used by other scientists as a shorthand measure of recognition of published work.</p>
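<p>For readers who want to see the arithmetic behind the metric, the sketch below computes a two-year impact factor. The function name and journal figures are invented for illustration; only the ratio itself reflects the standard definition (citations received this year to the previous two years’ articles, divided by the number of citable items published in those years).</p>

```python
# Illustrative sketch of the standard two-year impact factor.
# The function name and all numbers here are hypothetical.

def impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
    """Citations received this year to articles published in the previous
    two years, divided by the number of citable items in those years."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# A hypothetical journal: 480 citations in 2016 to its 2014-15 articles,
# of which there were 120.
print(impact_factor(480, 120))  # -> 4.0
```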
<p><a href="http://thomsonreuters.com/en/products-services/scholarly-scientific-research/scholarly-search-and-discovery/web-of-science.html">Between 1997 and 2014</a>, the number of journals publishing basic biological research increased by 212, but only four of these journals ranked in the top half of the impact factor scale. If one overlooks the flaws of the metric, these new journals may be seen as publishing work of perhaps lesser quality and limited impact. Indeed, I was told by a senior colleague when I was just starting my career that “a manuscript, once written, will be published somewhere,” insinuating that the quality of the work was irrelevant.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/145889/original/image-20161114-5091-nurlnz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/145889/original/image-20161114-5091-nurlnz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/145889/original/image-20161114-5091-nurlnz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/145889/original/image-20161114-5091-nurlnz.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/145889/original/image-20161114-5091-nurlnz.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/145889/original/image-20161114-5091-nurlnz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=502&fit=crop&dpr=1 754w, https://images.theconversation.com/files/145889/original/image-20161114-5091-nurlnz.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=502&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/145889/original/image-20161114-5091-nurlnz.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=502&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">It’s a rush to turn results into publications.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/nihgov/22364486194">National Eye Institute, National Institutes of Health</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span>
</figcaption>
</figure>
<p>The proliferation of “low impact” scientific journals has also expanded the “publish or perish” mantra of academia. It now matters not only how much you publish but also where you publish. This striving for exclusivity allows “top tier” journals to demand even more from scientists, who are willing to extend their studies beyond what was previously considered a standalone report (the so-called “<a href="http://doi.org/10.1126/science.7008199">least publishable unit</a>”) for the prize of a “good” publication.</p>
<p>For example, one analysis revealed that journal manuscripts published in 2014 <a href="http://doi.org/10.1073/pnas.1511912112">contained significantly more data</a> than those published in 1984. Producing more data takes longer and delays the release of studies that would previously have been considered complete. For instance, Ph.D. students are spending an average of <a href="http://doi.org/10.1073/pnas.1511912112">1.3 years longer</a> at one top graduate program over the same period. </p>
<p>And this high bar is elevated even further when individual journals reduce the numbers of studies that they publish.</p>
<h2>Buried in a barrage of papers</h2>
<p>The overall increased number of papers being written has also created a bottleneck in peer review, which negatively affects both quality and speed of publication.</p>
<p>I spend most of my editorial time trying to recruit qualified reviewers, <a href="http://doi.org/10.1002/leap.1006">who are increasingly too busy</a> to fulfill this professional responsibility. There’s no restriction on how far down the list I’m permitted to go in my attempts. When I receive invitations myself, they now often give me the option to choose people in my lab group to complete the review on their own, expanding the scope of “peer” to include “student.” I have also recently been invited, with no obvious check of my credentials, to join a service that will pay me to review manuscripts, a divergence from the norm, where reviewing papers has traditionally been considered part of an academic’s responsibility to the field and thus unpaid.</p>
<p>With this erosion of the peer review system, <a href="https://www.theguardian.com/science/2016/mar/07/hand-of-god-scientific-plos-one-anatomy-paper-citing-a-creator-retracted-after-furore">spectacular failures</a> are inevitable, such as the study crediting a divine “Creator” for the link between the structure of the hand and its grasping ability in a peer-reviewed publication.</p>
<p>Even without the explosion of preprints that <a href="https://grants.nih.gov/grants/rfi/rfi.cfm?ID=60">may be on the horizon</a>, scientists are having a hard time keeping up with the literature as it is. In a survey by the magazine The Scientist on the <a href="http://www.the-scientist.com/?articles.view/articleNo/27503/title/Citation-amnesia--The-results/">prevalence of omitted references</a>, 85 percent of respondents said the failure to cite previous studies in new publications is a serious or potentially serious problem in the life sciences. This slip in keeping current may lead to the <a href="http://doi.org/10.1001/jama.298.21.2517">persistence of incorrect conclusions</a> and to duplicated and therefore wasted effort. I recently reviewed a manuscript and pointed out that the vast majority of what was reported had been previously published, although none of the three other reviewers made this connection.</p>
<p>Thus, calls for self-publishing need to take scale into account; it may work for physics and mathematics, but in 2015 there were <a href="https://www.nsf.gov/statistics/2016/nsb20161/uploads/1/nsb20161.pdf">sixfold more manuscripts published in biology</a> than in physics, and 24-fold more than in mathematics. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/145890/original/image-20161114-5075-19xer9o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/145890/original/image-20161114-5075-19xer9o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/145890/original/image-20161114-5075-19xer9o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/145890/original/image-20161114-5075-19xer9o.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/145890/original/image-20161114-5075-19xer9o.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/145890/original/image-20161114-5075-19xer9o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/145890/original/image-20161114-5075-19xer9o.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/145890/original/image-20161114-5075-19xer9o.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">How to keep up?</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/realplastictrees/4003761256">Neal Patel</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span>
</figcaption>
</figure>
<h2>Keeping sight of the goal: Facilitating scientific progress</h2>
<p>Without question, scientific advances, funded by the public, should be shared without delay, a goal championed by the <a href="http://asapbio.org">#ASAPbio</a> movement. Indeed, reporting observations quickly for other scientists to use may seem like a good way to facilitate progress; but in reality, context is everything. There’s simply no way to remember the vast number of details if they’re not associated with a breakthrough in understanding. It’s these breakthroughs that provide a framework for not only organizing the details but vetting their accuracy. As a practical example, I know what I was wearing (detail) on Oct. 28, 2007, the day my son was born (context), but I have no idea what I wore (out-of-context detail) on Oct. 27.</p>
<p>To realize a faster pace of scientific progress, we need to balance the goal of sharing data with an assessment of quality and impact. Proponents of self-publication on internet servers such as <a href="http://biorxiv.org/">bioRxiv</a> suggest that scientists are so <a href="http://doi.org/10.1073/pnas.1511912112">concerned with their reputations that they will not release unsound studies</a>, but the increasing prevalence of <a href="http://www.vox.com/2016/3/24/11299102/scientific-retractions-are-on-the-rise">retracted peer-reviewed articles</a>, <a href="http://doi.org/10.1038/nature.2015.17711">irreproducible results</a> and <a href="http://doi.org/10.1073/pnas.1415135111">text reuse</a> argues that the pressures of the tournament can sometimes trump individual restraint. </p>
<p>Peer review clearly isn’t perfect, but rather than simply bypassing it and releasing even more information into an overloaded system, we should focus on making it better. The first step is to reset and clearly state our standards for quality in both publishing and peer reviewing. The outcome will certainly be fewer publications in biomedicine, but their individual impact will be greater. As a result, scholars will have a fighting chance to dedicate more time to evaluating new research and keeping up with the literature, which will facilitate progress.</p>
<p>Scientists and journals have driven the more-is-better mentality and don’t have the incentives to make these corrections. Instead, universities and granting agencies, which use publications as standards for evaluation, and <a href="http://doi.org/10.1038/532306a">the public</a>, which funds much of the research, must lead the charge to develop a mechanism for journal accreditation, with clear standards for publication and peer-review quality. If publishing scientific advances is worth doing, it is worth doing right.</p>
<p class="fine-print"><em><span>Tricia Serio is an Associate Editor at PLOS Genetics and guest editor at PLOS Biology and PNAS. She also reviews manuscripts for the following journals: Genes and Development, Nature Cell Biology, Molecular and Cellular Biology, Nature, Journal of Biological Chemistry, Molecular Biology of the Cell, Trends in Genetics, Journal of Molecular Evolution, PLOS One, Journal of Molecular Biology, Current Genetics, PNAS, Molecular Cell, EMBOJ, Molecular Microbiology, Genetics, Prion, Science, Gene, Yeast, and Nature Chemical Biology.</span></em></p>
The traditional mode of publishing scientific research faces much criticism – primarily for being too slow and sometimes shoddily done. Maybe fewer publications of higher quality is the way forward.
Tricia Serio, Professor and Department Head in Molecular and Cellular Biology, University of Arizona
Licensed as Creative Commons – attribution, no derivatives.
tag:theconversation.com,2011:article/65619 2016-09-21T00:01:46Z
Why isn’t science better? Look at career incentives
<figure><img src="https://images.theconversation.com/files/138450/original/image-20160920-11131-1alomb3.jpg?ixlib=rb-1.1.0&rect=49%2C65%2C5289%2C3660&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Experiment design affects the quality of the results.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/iaea_imagebank/8147632150">IAEA Seibersdorf Historical Images</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span></figcaption></figure>
<p>There are often substantial gaps between the idealized and actual versions of those people whose work involves providing a social good. Government officials are supposed to work for their constituents. Journalists are supposed to provide unbiased reporting and penetrating analysis. 
And scientists are supposed to relentlessly probe the fabric of reality with the most rigorous and skeptical of methods. </p>
<p>All too often, however, what should be just isn’t so. In a number of scientific fields, <a href="https://www.washingtonpost.com/news/speaking-of-science/wp/2015/08/28/no-sciences-reproducibility-problem-is-not-limited-to-psychology/">published findings turn out not to replicate</a>, or to have smaller effects than what was initially purported. Plenty of science does replicate – meaning the experiments turn out the same way when you repeat them – but the amount that doesn’t is too much for comfort.</p>
<p>Much of science is about identifying relationships between variables. For example, how might certain genes increase the risk of acquiring certain diseases, or how might certain parenting styles influence children’s emotional development? To our disappointment, there are no tests that allow us to perfectly sort true associations from spurious ones. Sometimes we get it wrong, even with the most rigorous methods.</p>
<p>But there are also ways in which scientists increase their chances of getting it wrong. Running studies with small samples, mining data for correlations and forming hypotheses to fit an experiment’s results after the fact are <a href="http://fivethirtyeight.com/features/science-isnt-broken/">just some of the ways</a> to <a href="http://doi.org/10.1038/526182a">increase the number of false discoveries</a>. </p>
<p>It’s not like we don’t know how to do better. Scientists who study scientific methods have known about <a href="http://doi.org/10.1086/288135">feasible remedies for decades</a>. Unfortunately, their advice often falls on deaf ears. Why? Why aren’t scientific methods better than they are? In a word: incentives. But perhaps not in the way you think. </p>
<h2>Incentives for ‘good’ behavior</h2>
<p>In the 1970s, <a href="https://en.wikipedia.org/wiki/Campbell%27s_law">psychologists</a> and <a href="https://en.wikipedia.org/wiki/Goodhart%27s_law">economists</a> began to point out the danger in relying on quantitative measures for social decision-making. For example, when public schools are evaluated by students’ performance on standardized tests, teachers respond by teaching “to the test” – at the expense of broader material more important for critical thinking. In turn, the test serves largely as a measure of how well the school can prepare students for the test.</p>
<p>We can see this principle – often summarized as “when a measure becomes a target, it ceases to be a good measure” – playing out in the realm of research. Science is a competitive enterprise. There are <a href="http://doi.org/10.1038/520144a">far more credentialed scholars and researchers</a> than there are university professorships or comparably prestigious research positions. Once someone acquires a research position, there is additional competition for tenure, grant funding, and support and placement for graduate students. Due to this competition for resources, scientists must be evaluated and compared. How do you tell if someone is a good scientist?</p>
<p>An oft-used metric is the number of publications one has in peer-reviewed journals, as well as the status of those journals (along with related metrics, such as the <a href="https://en.wikipedia.org/wiki/H-index"><em>h</em>-index</a>, which purports to measure the rate at which a researcher’s work is cited by others). Metrics like these make it straightforward to compare researchers whose work may otherwise be quite different. Unfortunately, this also makes these numbers susceptible to exploitation. </p>
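<p>The <em>h</em>-index has a simple standard definition – the largest <em>h</em> such that a researcher has <em>h</em> papers each cited at least <em>h</em> times – and a few lines of Python make concrete both how it is computed and what it misses (an illustrative sketch, not tied to any particular citation database):</p>

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:  # paper at this rank still clears the threshold
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # steady output -> 4
print(h_index([90, 2, 1]))        # one blockbuster paper still gives only 2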
<p>If scientists are motivated to publish often and in high-impact journals, we might expect them to actively try to game the system. And certainly, some do – as seen in recent high-profile cases of scientific fraud (including in <a href="https://en.wikipedia.org/wiki/Sch%C3%B6n_scandal">physics</a>, <a href="http://www.nytimes.com/2013/04/28/magazine/diederik-stapels-audacious-academic-fraud.html">social psychology</a> and <a href="http://onlinelibrary.wiley.com/doi/10.1111/bcp.12992/full">clinical pharmacology</a>). If malicious fraud is the prime concern, then perhaps the solution is simply heightened vigilance.</p>
<p>However, most scientists are, I believe, genuinely interested in learning about the world, and honest. The problem with incentives is they can shape cultural norms without any intention on the part of individuals. </p>
<h2>Cultural evolution of scientific practices</h2>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/138454/original/image-20160920-11090-684nc6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/138454/original/image-20160920-11090-684nc6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/138454/original/image-20160920-11090-684nc6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=784&fit=crop&dpr=1 600w, https://images.theconversation.com/files/138454/original/image-20160920-11090-684nc6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=784&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/138454/original/image-20160920-11090-684nc6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=784&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/138454/original/image-20160920-11090-684nc6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=986&fit=crop&dpr=1 754w, https://images.theconversation.com/files/138454/original/image-20160920-11090-684nc6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=986&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/138454/original/image-20160920-11090-684nc6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=986&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Scientists work within a culture of research.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/iaea_imagebank/8199500456">IAEA</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>In a <a href="http://rsos.royalsocietypublishing.org/lookup/doi/10.1098/rsos.160384">recent paper</a>, anthropologist <a href="http://xcelab.net/rm/">Richard McElreath</a> and I considered the incentives in science through the lens of <a href="http://www.oxfordbibliographies.com/view/document/obo-9780199766567/obo-9780199766567-0038.xml">cultural evolution</a>, an emerging field that draws on ideas and models from evolutionary biology, epidemiology, psychology and the social sciences to understand cultural organization and change.</p>
<p>In our analysis, we assumed that methods associated with greater success in academic careers will, all else equal, tend to spread. The spread of more successful methods requires no conscious evaluation of how scientists do or do not “game the system.” </p>
<p>Recall that publications, particularly in high-impact journals, are the currency used in decisions about hiring, promotion and funding. Studies that show large and surprising associations tend to be favored for publication in top journals, while small, unsurprising or complicated results are more difficult to publish.</p>
<p>But <a href="http://dx.doi.org/10.1371/journal.pmed.0020124">most hypotheses are probably wrong</a>, and performing rigorous tests of novel hypotheses (as well as coming up with good hypotheses in the first place) takes time and effort. Methods that boost false positives (incorrectly identifying a relationship where none exists) and overestimate effect sizes will, on average, allow their users to publish more often. In other words, when novel results are incentivized, methods that produce them – by whatever means – at the fastest pace will become implicitly or explicitly encouraged.</p>
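<p>The arithmetic behind this claim is easy to check with Bayes’ rule. Using illustrative numbers (my own, not figures from the paper) for the share of tested hypotheses that are true, a test’s power and its false-positive rate:</p>

```python
def false_discovery_share(prior_true, power, alpha):
    """Fraction of 'positive' results that are false:
    false positives / (true positives + false positives)."""
    true_pos = prior_true * power          # true hypotheses correctly detected
    false_pos = (1 - prior_true) * alpha   # false hypotheses wrongly "detected"
    return false_pos / (true_pos + false_pos)

# If only 10% of tested hypotheses are true, a test with 35% power and a
# 5% false-positive rate produces positives that are false more than half the time.
print(false_discovery_share(0.10, 0.35, 0.05))  # 0.5625
```

<p>Raising power shrinks that share, which is exactly why methods that quietly trade rigor for speed push the literature in the wrong direction.</p>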
<p>Over time, those shoddy methods will become associated with success, and they will tend to spread. The argument can extend beyond norms of questionable research practices to norms of misunderstanding, if those misunderstandings lead to success. For example, despite over a century of common usage, the <em>p</em>-value, a standard measure of statistical significance, is still <a href="http://dx.doi.org/10.1080/00031305.2016.1154108">widely misunderstood</a>.</p>
<p>The cultural evolution of shoddy science in response to publication incentives requires no conscious strategizing, cheating or loafing on the part of individual researchers. There will always be researchers committed to rigorous methods and scientific integrity. But as long as institutional incentives reward positive, novel results at the expense of rigor, the rate of bad science, on average, will increase. </p>
<h2>Simulating scientists and their incentives</h2>
<p>There is ample evidence suggesting that publication incentives have been negatively shaping scientific research for decades. The frequency of the words <a href="http://dx.doi.org/10.1136/bmj.h6467">“innovative,” “groundbreaking” and “novel”</a> in biomedical abstracts increased by 2,500 percent or more over the past 40 years. Moreover, researchers often <a href="http://dx.doi.org/10.1126/science.1255484">don’t report when hypotheses fail to generate positive results</a>, lest reporting such failures hinder publication.</p>
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/138455/original/image-20160920-11127-ntmb9h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/138455/original/image-20160920-11127-ntmb9h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/138455/original/image-20160920-11127-ntmb9h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=736&fit=crop&dpr=1 600w, https://images.theconversation.com/files/138455/original/image-20160920-11127-ntmb9h.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=736&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/138455/original/image-20160920-11127-ntmb9h.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=736&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/138455/original/image-20160920-11127-ntmb9h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=925&fit=crop&dpr=1 754w, https://images.theconversation.com/files/138455/original/image-20160920-11127-ntmb9h.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=925&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/138455/original/image-20160920-11127-ntmb9h.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=925&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">There doesn’t need to be anything nefarious going on for scientists to stick with the suboptimal methods that help them get ahead.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/iaea_imagebank/8198415199">IAEA</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>We reviewed <a href="http://www.statisticsdonewrong.com/power.html">statistical power</a> in the social and behavioral science literature. Statistical power is a quantitative measure of a research design’s ability to detect a true association when one is present. The simplest way to increase statistical power is to increase one’s sample size – which also lengthens the time needed to collect data. Beginning in the 1960s, there have been <a href="http://datacolada.org/wp-content/uploads/2013/10/3416-Sedlmeier-Gigerenzer-Psych-Bull-1989-Do-studies-of-statistical-power-have-an-effect-on-the-power-of-studies.pdf">repeated outcries that statistical power is far too low</a>. Nevertheless, we found that statistical power, on average, <a href="http://rsos.royalsocietypublishing.org/lookup/doi/10.1098/rsos.160384">has not increased</a>.</p>
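<p>Power can itself be estimated by simulation. The sketch below (standard library only, using the approximate large-sample 5% critical value rather than an exact t distribution) runs many simulated two-group experiments and counts how often the test detects a real, medium-sized effect:</p>

```python
import random
import statistics

def estimated_power(effect, n, trials=2000, seed=42):
    """Monte-Carlo power estimate for a two-sample comparison:
    the fraction of simulated experiments whose Welch t statistic
    exceeds the approximate two-sided 5% critical value of 1.96."""
    random.seed(seed)
    crit = 1.96
    hits = 0
    for _ in range(trials):
        a = [random.gauss(0.0, 1.0) for _ in range(n)]     # control group
        b = [random.gauss(effect, 1.0) for _ in range(n)]  # group with a true effect
        se = (statistics.variance(a) / n + statistics.variance(b) / n) ** 0.5
        t = (statistics.mean(b) - statistics.mean(a)) / se
        if abs(t) > crit:
            hits += 1
    return hits / trials

# A medium effect with 20 subjects per group is detected only about a third
# of the time; quadrupling the sample size raises power dramatically.
print(estimated_power(0.5, 20))
print(estimated_power(0.5, 80))
```

<p>The catch is visible in the second line: the main route to higher power is a larger sample, which takes longer to collect – precisely the cost that publication incentives discourage paying.</p>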
<p>The evidence is suggestive, but it is not conclusive. To more systematically demonstrate the logic of our argument, we built a computer model in which a population of research labs studied hypotheses, only some of which were true, and attempted to publish their results.</p>
<p>As part of our analysis, we assumed that each lab exerted a characteristic level of “effort.” Increasing effort lowered the rate of false positives, and also lengthened the time between results. As in reality, we assumed that novel positive results were easier to publish than negative results. All of our simulated labs were totally honest: they never cheated. However, labs that published more were more likely to have their methods “reproduced” in new labs – just as they would be in reality as students and postdocs leave successful labs where they trained and set up their own labs. We then allowed the population to evolve.</p>
<p>The result: Over time, effort decreased to its minimum value, and the rate of false discoveries skyrocketed. </p>
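<p>A deliberately simplified toy model in the same spirit (my own illustrative parameters and payoff functions, not the model from the paper) shows the dynamic: labs with low “effort” run more studies and report more positives, and high-publishing labs are copied more often, so effort is selected downward without anyone cheating:</p>

```python
import random

def evolve_effort(n_labs=100, generations=200, seed=1):
    """Toy cultural-evolution model: publication count determines which
    labs' methods get copied, so low-effort (fast, error-prone) methods spread."""
    random.seed(seed)
    efforts = [random.uniform(0.2, 0.8) for _ in range(n_labs)]
    for _ in range(generations):
        payoffs = []
        for e in efforts:
            attempts = int(10 * (1.5 - e))         # low effort -> more studies run
            p_positive = 0.1 + 0.45 * (1 - e)      # low effort -> more (false) positives
            pubs = sum(random.random() < p_positive for _ in range(attempts))
            payoffs.append(pubs + 0.01)            # small floor keeps weights positive
        # new labs copy the methods of high-publishing labs, with slight drift
        efforts = [
            min(1.0, max(0.0, e + random.gauss(0.0, 0.02)))
            for e in random.choices(efforts, weights=payoffs, k=n_labs)
        ]
    return sum(efforts) / n_labs

# Mean effort starts near 0.5 and is driven toward its minimum.
print(evolve_effort())
```

<p>Every simulated lab here is honest; the decline in effort comes purely from differential copying of whatever happens to publish most.</p>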
<p>And replication – while a crucial tool for generating robust scientific theories – isn’t going to be science’s savior. Our simulations indicate that more replication won’t stem the evolution of bad science.</p>
<h2>Taking on the system</h2>
<p>The bottom-line message from all this is that it’s not sufficient to impose high ethical standards (assuming that were possible), nor to make sure all scientists are informed about best practices (though spreading awareness is certainly one of our goals). A culture of bad science can evolve as a result of institutional incentives that prioritize simple quantitative metrics as measures of success. </p>
<p>There are indications that the situation is improving. Journals, organizations, and universities are increasingly emphasizing <a href="http://www.psychologicalscience.org/index.php/replication">replication</a>, <a href="https://royalsociety.org/journals/ethics-policies/data-sharing-mining/">open data</a>, <a href="http://blogs.plos.org/everyone/2015/02/25/positively-negative-new-plos-one-collection-focusing-negative-null-inconclusive-results/">the publication of negative results</a> and more <a href="https://www.idrc.ca/sites/default/files/sp/Documents%20EN/Research-Quality-Plus-A-Holistic-Approach-to-Evaluating-Research.pdf">holistic evaluations</a>. Internet applications such as <a href="https://twitter.com/lakens/status/774953862012755968">Twitter</a> and <a href="https://www.youtube.com/watch?v=WFv2vS8ESkk&list=PLDcUM9US4XdMdZOhJWJJD4mDBMnbTWw_z">YouTube</a> allow education about best practices to propagate widely, along with spreading norms of holism and integrity. </p>
<p>There are also signs that the old ways are far from dead. For example, one regularly hears researchers discussed in terms of how much or where they publish. The good news is that as long as there are smart, interesting people doing science, there will always be some good science. And from where I sit, there is still quite a bit of it.</p>
<p class="fine-print"><em><span>Paul Smaldino does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Embracing more rigorous scientific methods would mean getting science right more often than we currently do. But the way we value and reward scientists makes this a challenge.Paul Smaldino, Assistant Professor of Cognitive and Information Sciences, University of California, MercedLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/588422016-06-13T09:55:46Z2016-06-13T09:55:46ZPersonal beliefs versus scientific innovation: getting past a flat Earth mentality<figure><img src="https://images.theconversation.com/files/126190/original/image-20160610-29238-i1szs1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Can new ideas break through preconceived notions?</span> <span class="attribution"><a class="source" href="http://www.shutterstock.com/pic.mhtml?id=423739129&src=id">Light bulb image via www.shutterstock.com.</a></span></figcaption></figure><p>The history of science is also a history of people resisting new discoveries that conflict with conventional wisdom.</p>
<p>When Galileo promoted Copernicus’ theory that the <a href="http://earthobservatory.nasa.gov/Features/OrbitsHistory/">Earth revolves around the sun</a> – counter to church doctrine about the Earth being the center of the universe – he wound up condemned by the Roman Inquisition in 1633. <a href="http://www.pewforum.org/2009/02/04/darwin-and-his-theory-of-evolution/">Charles Darwin’s theory of evolution</a> – that new species develop as a result of natural selection on inherited traits – ran into opposition because it contradicted long-held scientific, political and religious beliefs. <a href="http://www.scientus.org/Wegener-Continental-Drift.html">Alfred Wegener’s 1912 proposal</a> that Earth’s continents move relative to each other – the theory of continental drift – was rejected for decades, in part because scientists held fast to the traditional theories they’d spent careers developing.</p>
<p>These kinds of examples aren’t only historical, unfortunately. We’re used to hearing about how the general public <a href="http://www.pewinternet.org/2015/09/10/what-the-public-knows-and-does-not-know-about-science/">can be dense about science</a>. You might expect some portion of everyday folks to take their time <a href="http://www.pewresearch.org/fact-tank/2015/01/29/5-key-findings-science/">coming around on truly groundbreaking ideas</a> that run counter to what they’ve always thought.</p>
<p>But scientists, too, hold their own personal beliefs – by definition, based on old ways of thinking – that may be holding back the innovation that’s at the heart of science. And that’s a problem. It’s one thing for an average Joe to resist evolving scientific theories. It’s quite another if a scientist’s preconceived notions hold us back from discovering the new and unknown – whether that’s a cure for Zika or a cutting-edge technology to combat climate change.</p>
<h2>Personal beliefs as publication roadblocks</h2>
<p>Real scientific progress occurs when laboratory or field research is reported to the public. With luck, the finding is accepted and put into practice, cures are developed, social policies are instituted, educational practices are improved and so on.</p>
<p>This usually occurs through publication of the research in scientific journals. There’s an important step between the lab and publication that laypeople may not know about – the evaluation of the research by other scientists. These other scientists are peers of the researcher, typically working in a closely related area. This middle step is commonly referred to as <a href="https://www.elsevier.com/reviewers/what-is-peer-review">peer review</a>.</p>
<p>In a perfect world, <a href="http://olabout.wiley.com/WileyCDA/Section/id-828027.html">peer review</a> is supposed to determine if the study is solid, based on the quality of the research. It’s meant to be an unbiased evaluation of whether the findings should be reported via journal publication. This important step prevents sloppy research from reaching the public.</p>
<p>However, in the real world, scientists are human beings and are often biased. They let their own beliefs influence their peer reviews. For example, numerous reports indicate that scientists rate research more favorably if the <a href="http://doi.org/10.1007/s12144-010-9087-5">findings agree with their prior beliefs</a>. Worst of all, these prior beliefs often have nothing to do with science but are simply the scientists’ personal views.</p>
<h2>‘But that’s counter to what I thought…’</h2>
<p>How is this a problem for scientific innovation? Let’s look at how some personal beliefs could prevent innovative science from reaching the public.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/125992/original/image-20160609-7059-11xg2px.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/125992/original/image-20160609-7059-11xg2px.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/125992/original/image-20160609-7059-11xg2px.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=235&fit=crop&dpr=1 600w, https://images.theconversation.com/files/125992/original/image-20160609-7059-11xg2px.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=235&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/125992/original/image-20160609-7059-11xg2px.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=235&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/125992/original/image-20160609-7059-11xg2px.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=296&fit=crop&dpr=1 754w, https://images.theconversation.com/files/125992/original/image-20160609-7059-11xg2px.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=296&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/125992/original/image-20160609-7059-11xg2px.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=296&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">What if she’s on the path to a revolutionary idea?</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/ciat/4331056560">CIAT</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p><strong><em>“Minorities aren’t good at STEM.”</em></strong> The stereotype that “<a href="http://www.sciencemag.org/news/2014/03/both-genders-think-women-are-bad-basic-math">women are not good at math</a>” is commonly held – and also happens to be incorrect. If a scientist holds this personal belief, then he is likely to judge any research done by women in STEM (Science, Technology, Engineering and Mathematics) more negatively – not because of its quality, but because of his own personal belief.</p>
<p>For instance, some studies have shown that female STEM applicants in academia are <a href="http://doi.org/10.1073/pnas.1211286109">judged more harshly than their male counterparts</a>. Because of this gender bias, it may take a female STEM researcher more time and effort before her work reaches the public.</p>
<p>Some racial minorities face similar kinds of bias. For example, one study found that <a href="http://doi.org/10.1126/science.1196783">black applicants are less likely to receive research funding</a> from the U.S. National Institutes of Health than equivalently qualified whites. That’s a major roadblock to these researchers advancing their work.</p>
<p><strong><em>“Comic books are low-brow entertainment for kids.”</em></strong> Here’s an example from my own area of expertise. </p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/126188/original/image-20160610-29216-3ofw6b.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/126188/original/image-20160610-29216-3ofw6b.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/126188/original/image-20160610-29216-3ofw6b.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=792&fit=crop&dpr=1 600w, https://images.theconversation.com/files/126188/original/image-20160610-29216-3ofw6b.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=792&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/126188/original/image-20160610-29216-3ofw6b.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=792&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/126188/original/image-20160610-29216-3ofw6b.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=995&fit=crop&dpr=1 754w, https://images.theconversation.com/files/126188/original/image-20160610-29216-3ofw6b.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=995&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/126188/original/image-20160610-29216-3ofw6b.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=995&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Does this look like a legitimate area of inquiry to you? Analysis of comic book images can yield new insights into how we perceive.</span>
</figcaption>
</figure>
<p><a href="http://doi.org/10.1386/stic.5.1.57_1">Comic book research</a> is a relatively recent area of study. Perhaps because of this, innovative findings in psychology have been <a href="http://doi.org/10.16995/cg.71">discovered by analyzing comic book images</a>.</p>
<p>However, people often believe that comic books are just <a href="http://www.peterlang.com/index.cfm?event=cmp.ccc.seitenstruktur.detailseiten&seitentyp=produkt&pk=46531&concordeid=68892">low-brow entertainment for kids</a>. If a scientist holds this personal belief, then she’s likely to judge any psychology research using comic books more negatively. Because of this, scientists like me who focus on comic books may not be able to publish in the most popular psychology journals. As a result, fewer people will ever see this research.</p>
<p><strong><em>“The traditional ways are the best ways.”</em></strong> A final example is a personal belief that directly counters scientific innovation. Often, scientists believe that traditional methods and techniques are better than any newly proposed approaches.</p>
<p>The history of psychology supplies one example. Behaviorism was psychology’s dominant school of thought for the first part of the 20th century, relying on observed behavior to provide insights. Its devotees rejected new techniques for studying psychology. During behaviorism’s reign, any talk of internal processes of the mind was considered taboo. One of the pioneers of the subsequent cognitive revolution, George A. Miller, said “<a href="http://www.nytimes.com/2012/08/02/us/george-a-miller-cognitive-psychology-pioneer-dies-at-92.html?pagewanted=all&_r=0">using ‘cognitive’ was an act of defiance</a>.” Luckily for us, he <em>was</em> defiant and published <a href="http://dx.doi.org/10.1037/h0043158">one of the most highly cited papers in psychology</a>.</p>
<p>If a scientist believes the way we’ve always done things in the lab is best, then she’ll judge any research done using novel approaches more negatively. Because of this, <a href="http://dx.doi.org/10.2139/ssrn.2710572">highly innovative work</a> is rarely published in the best scientific journals and is often recognized only after considerable delay.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/125993/original/image-20160609-7093-19zg7zc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/125993/original/image-20160609-7093-19zg7zc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/125993/original/image-20160609-7093-19zg7zc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=402&fit=crop&dpr=1 600w, https://images.theconversation.com/files/125993/original/image-20160609-7093-19zg7zc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=402&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/125993/original/image-20160609-7093-19zg7zc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=402&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/125993/original/image-20160609-7093-19zg7zc.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=505&fit=crop&dpr=1 754w, https://images.theconversation.com/files/125993/original/image-20160609-7093-19zg7zc.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=505&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/125993/original/image-20160609-7093-19zg7zc.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=505&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">We know our planet is round. But are we missing out on other innovative ideas?</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/scratchpost/7356754296">Jaya Ramchandani</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<h2>How is this a problem for scientific progress?</h2>
<p>Almost by definition, the most important and innovative scientific findings often go against people’s existing beliefs. If research that conforms to personal beliefs is favored, then any research that is based on new ideas runs the risk of being passed over. It takes a leap to imagine a round Earth when everyone’s always believed it to be flat.</p>
<p>When old ideas rule the day, scientific progress stalls. And as our <a href="http://www.penguin.com/book/the-age-of-spiritual-machines-by-ray-kurzweil/9780140282023">world changes at an ever faster pace</a>, we need innovative thinking to face the coming challenges.</p>
<p>How can scientists stop their personal beliefs from impeding scientific progress? Completely removing personal beliefs from these contexts is impossible. But we can work to change our beliefs so that, instead of hampering scientific progress, they encourage it. Many studies have outlined <a href="http://doi.org/10.1080/0163853X.2015.1026680">possible ways to modify beliefs</a>. It’s up to scientists, and indeed society as well, to begin to examine their own beliefs and change them for the better.</p>
<p>After all, we don’t want to delay the next revolutionary idea in climate science, pioneering cure for cancer, or dazzling discovery in astronomy just because we can’t see past our original beliefs.</p>
<p class="fine-print"><em><span>Igor Juricevic does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The very goal of science, to discover the new and unknown, is hampered by any outdated personal beliefs scientists hold.Igor Juricevic, Assistant Professor of Psychology (Perception and Cognition), Indiana UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/598712016-06-02T01:04:41Z2016-06-02T01:04:41ZAccurate science or accessible science in the media – why not both?<figure><img src="https://images.theconversation.com/files/124855/original/image-20160601-1951-sdxq1j.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Scientists themselves may be the key to finding the right balance.</span> <span class="attribution"><a class="source" href="http://www.shutterstock.com/pic-342000797/stock-photo-scales-on-wooden-background.html">Scales image via www.shutterstock.com.</a></span></figcaption></figure><p>Every day, millions of people take to search engines with common concerns, such as “How can I lose weight?” or “How can I be productive?” In return, they find articles that offer simple advice and quick solutions, supposedly based on what “studies have shown.”</p>
<p>A closer look at these articles, however, reveals a troubling absence of scientific rigor. Few bother to cite research or discuss studies’ methodologies or limitations. The <a href="http://doi.org/10.1093/embo-reports/kvf225">authors seldom have scientific training</a>.</p>
<p>As young scientists from four diverse fields (psychology, chemistry, physics and neuroscience), we’ve noticed that much writing about science, particularly on topics most relevant to the daily lives of readers, is currently failing to resolve the trade-off between accessibility and accountability. Rigorous findings shared by researchers in specialist journals are obscured behind jargon and paywalls, while accessible science shared on the internet is untrustworthy, unregulated and often click-bait.</p>
<p>If this communication crisis is due to a lack of scientifically literate voices, the solution may be for more scientists to enter the fray. Scientists have the expertise to publicly correct misinterpretations of their and others’ data. By developing new ways to disseminate science knowledge, they can help prevent inaccurate and overhyped stories from gaining traction. We argue that scientists bear a responsibility to reform the way their work is ultimately communicated.</p>
<h2>Science gets lost in translation</h2>
<p>Scientific publication – which operates through an intensive peer review process – is flourishing. In 2014, over <a href="http://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1008&context=scholcom">2.5 million scholarly articles</a> were published on topics that ranged from how to <a href="http://doi.org/10.1038/nature.2015.18965">reduce carbon emissions</a> to how <a href="http://doi.org/10.1177/0956797614557867">Twitter influences the rate of heart disease</a> and how <a href="http://doi.org/10.1038/nrrheum.2014.193">regular exercise can prevent inflammation</a> associated with rheumatic diseases. Because of recent research, we know there’s little evidence that genetically modified vegetables <a href="http://doi.org/10.1016/j.envint.2011.01.003">are unhealthy</a>, and that <a href="http://doi.org/10.1016/j.gloenvcha.2014.02.004">eating less meat</a> is <a href="http://doi.org/10.1007/s10584-014-1104-5">a simple way</a> to positively influence the environment.</p>
<p>These are important messages, and when people don’t hear or listen to them, there can be serious consequences. Misinformed campaigns arise <a href="https://theconversation.com/vaccines-back-in-the-headlines-heres-what-the-experts-say-47815">against vaccinations</a>, and <a href="http://www.usatoday.com/story/news/nation/2014/04/06/anti-vaccine-movement-is-giving-diseases-a-2nd-life/7007955/">near-extinct diseases return</a>. Mental illness remains <a href="https://theconversation.com/inspiration-from-gamers-on-tackling-mental-health-stigma-18769">shamefully stigmatized</a>. Climate change is <a href="https://www.skepticalscience.com/argument.php">dismissed as fiction</a>. People become erroneously convinced that <a href="http://www.buzzfeed.com/tomchivers/bacon-and-sausages-do-cause-cancer-says-the-who#.mxz3wYjge">red meat causes cancer</a> and that <a href="http://io9.gizmodo.com/i-fooled-millions-into-thinking-chocolate-helps-weight-1707251800">eating dark chocolate helps weight loss</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/124856/original/image-20160601-1951-zljpl5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/124856/original/image-20160601-1951-zljpl5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/124856/original/image-20160601-1951-zljpl5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/124856/original/image-20160601-1951-zljpl5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/124856/original/image-20160601-1951-zljpl5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/124856/original/image-20160601-1951-zljpl5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/124856/original/image-20160601-1951-zljpl5.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/124856/original/image-20160601-1951-zljpl5.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">It’s hard for the general public to even access most research journals.</span>
<span class="attribution"><span class="source">Maggie Villiger</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<h2>Rigorous science is locked away</h2>
<p>So how can we ensure that everyone has access to useful science knowledge?</p>
<p>Most scientific articles are aimed at an audience of other experts in highly specific fields, making them ill-suited for popular consumption. Between complex methodological language and frequent acronyms, even scientists have trouble following the <a href="http://arstechnica.com/staff/2013/04/two-sciences-separated-by-a-common-language/">jargon specific to other fields</a>, leaving little hope for those with less scientific training.</p>
<p>An even more pressing issue, however, is that people outside of research institutions can’t even access most journal articles. Many of these papers are <a href="https://www.theguardian.com/science/blog/2013/jan/17/open-access-publishing-science-paywall-immoral">hidden behind a publisher paywall</a>, and nonsubscribers are forced to pay <a href="http://www.theatlantic.com/national/archive/2011/02/read-this-academic-journal-article-but-prepare-to-pay/71536/">US$30-$50 for a single article</a>.</p>
<p>These paywalls are not merely obstructive; we would argue they’re also unethical. Most research is publicly funded, yet taxpayers are charged to consume scientific articles.</p>
<p>Ideally, scientific publishing will transition to healthy open-access journals that serve both researchers and readers. Legislation regarding quasi-monopolistic scientific publishing companies, predatory publishing practices and public access to primary scientific sources would go far to serve this end. </p>
<p>The European Union recently stipulated that all <a href="http://doi.org/10.1126/science.aag0577">publicly funded research articles be freely accessible</a> by 2020, but the United States has not yet passed a similar mandate. Scientists will play a crucial role in calling for and implementing these kinds of changes.</p>
<h2>The public wants accessible science</h2>
<p>As debates over open access continue, people’s desire and need for evidence-based solutions to medical and social dilemmas have not diminished. As a consequence, we see a rising tide of popular science outlets that are more accessible, both in content and availability, than the research journals some of their content is ostensibly based on. </p>
<p>These platforms range in accuracy, from questionable blogs preaching “7 ways to get happy now” to serious websites and magazines like <a href="http://discovermagazine.com/">Discover</a> and <a href="http://www.americanscientist.org">American Scientist</a>. As part of our own efforts to bridge the divide between accessibility and accuracy, we each contribute content to the nonprofit <a href="http://www.usefulscience.org/">Useful Science</a>, which curates research for the general public through short reviewed summaries and an <a href="http://www.usefulscience.org/podcast">in-depth podcast</a>.</p>
<p>However, even reputable sources are not immune to sensational headlines. In 2012, a Science News article on female mimicry in snakes was titled “<a href="https://www.sciencenews.org/article/she-male-garter-snakes-some-it-hot">She-male garter snakes: some like it hot</a>.” An article on male sheep neuroendocrinology was headlined “<a href="http://www.washingtonpost.com/wp-dyn/content/article/2007/02/02/AR2007020201462.html">Brokeback mutton</a>” by the Washington Post, and “<a href="http://content.time.com/time/magazine/article/0,9171,1582336,00.html">Yep, they’re gay</a>” by Time. This unfortunate trend in popular science suggests that open-access publishing, even if it does proliferate, would still need to compete with flashier posts that sacrifice strict validity for clicks.</p>
<p>The growth of science communication websites that solicit and address questions and feedback directly and immediately from the general public provides some hope. These include <a href="https://www.quora.com/">Quora</a> and communities on Reddit such as <a href="https://www.reddit.com/r/askscience">AskScience</a>. The popularity of these resources (AskScience has over <a href="https://www.reddit.com/r/askscience/">eight million subscribers</a>) shows that a good portion of the public wants scientific information communicated, on demand, in an accurate and approachable manner. Furthermore, a lack of direct incentive for contributors may make <a href="http://doi.org/10.1038/nn0505-535">content manipulation less likely</a>.</p>
<p>These efforts are laudable but suffer from a lack of accountability – any author can claim to be speaking from a perspective of expertise. Even in the best cases, when authors have training in science or its communication, advice is not scrutinized prior to posting.</p>
<p>There are ways to resolve these problems. Science journalists should solicit feedback from independent experts before publishing. Posts in scientific communities could go through an expedited peer-review process. In all cases, scientists and science communicators should be working together to match the accessibility of their content with accuracy and precision. </p>
<h2>Who will lead the revolution?</h2>
<p>The present state of science communication reveals important work to be done, but no clear assignment of responsibility for doing it. </p>
<p>Some responsibility seems to fall on scientific journals, but most journals are profit vehicles, not conscientious individuals. Some seems to fall on media outlets, but many websites and magazines are squeezed by intense competition for ad revenue. Furthermore, reporters are seldom trained to understand science, let alone contribute to the discipline’s evolution.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/124858/original/image-20160601-1425-wv6cbg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/124858/original/image-20160601-1425-wv6cbg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/124858/original/image-20160601-1425-wv6cbg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/124858/original/image-20160601-1425-wv6cbg.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/124858/original/image-20160601-1425-wv6cbg.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/124858/original/image-20160601-1425-wv6cbg.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=502&fit=crop&dpr=1 754w, https://images.theconversation.com/files/124858/original/image-20160601-1425-wv6cbg.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=502&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/124858/original/image-20160601-1425-wv6cbg.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=502&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Researchers need to think beyond the lab notebook.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/proteinbiochemist/3167660996">J Biochemist</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span>
</figcaption>
</figure>
<p>The onus, then, is on scientists. There are <a href="https://www.census.gov/prod/2012pubs/acs-18.pdf">20 million people with science or engineering degrees</a> in the United States alone. Instead of passively consuming media full of outrageous scientific claims, scientists should take personal responsibility for making research freely available, and for moderating accessible scientific communities so they’re accurate and accountable. They should also work with journalists to set guidelines for media publication, such as a vetting process in which popular articles are approved by experts in the field before publication, and should speak up when inaccurate information is disseminated. </p>
<p>It’s time for the scientific community to act, not only as individuals but also as interdisciplinary groups. If scientists do so, the next generation of science communication vehicles may be coalitions of journalists and researchers (as in <a href="https://theconversation.com/us/who-we-are">The Conversation’s collaborative model</a>) who can disseminate messages that are both exciting and responsible. Science will not only be more interesting and accountable. It will also be more useful.</p>
<p class="fine-print"><em><span>Joshua Conrad Jackson receives funding from the National Science Foundation. He is affiliated with the non-profit organization Useful Science (usefulscience.org). </span></em></p><p class="fine-print"><em><span>Ian Mahar receives funding from Fonds de Recherche du Québec - Santé. He is affiliated with the non-profit organization Useful Science (usefulscience.org). </span></em></p><p class="fine-print"><em><span>Jaan Altosaar founded Useful Science (usefulscience.org) and receives funding from the Natural Sciences and Engineering Research Council of Canada.</span></em></p><p class="fine-print"><em><span>Michael Gaultois receives funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 659764. He is affiliated with the non-profit organization Useful Science (usefulscience.org).</span></em></p>The public loses when their only choices are inaccessible, impenetrable journal articles or overhyped click-bait about science. Scientists themselves need to step up and help bridge the divide.Joshua Conrad Jackson, Doctoral Student, Department of Psychology and Neuroscience, University of North Carolina at Chapel HillIan Mahar, Postdoctoral Research Fellow, Neuroscience, Boston UniversityJaan Altosaar, Ph.D. Student in Physics, Princeton UniversityMichael Gaultois, Postdoctoral Researcher in Chemistry, University of CambridgeLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/536772016-02-11T16:05:13Z2016-02-11T16:05:13ZThe logic of journal embargoes: why we have to wait for scientific news<figure><img src="https://images.theconversation.com/files/111203/original/image-20160211-29190-1yx92jl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Extra, extra! 
The embargo's lifted, read all about it.</span> <span class="attribution"><a class="source" href="http://www.shutterstock.com/pic.mhtml?id=248829895&src=id">Newspapers image via www.shutterstock.com.</a></span></figcaption></figure><p>Rumors were flying through the blogosphere this winter: physicists at the Advanced Laser Interferometer Gravitational-Wave Observatory (<a href="https://www.ligo.caltech.edu/">LIGO</a>) may finally have directly detected <a href="http://www.nature.com/news/gravitational-waves-6-cosmic-questions-they-can-tackle-1.19337">gravitational waves</a>, ripples in the fabric of space-time predicted by Einstein 100 years ago in his general theory of relativity. Gravitational waves were predicted to be produced by cataclysmic events such as the collision of two black holes.</p>
<p>If true, it would be a very big deal: a rare chance for scientists to grab the attention of the public through news of cutting-edge research. So why were the scientists themselves keeping mum?</p>
<p>This wouldn’t be the first time scientists thought they had detected gravitational waves. In March 2014, a group claimed to have done so, announcing their discovery by posting an article to <a href="http://arxiv.org">arXiv</a>, a preprint server where physicists and other scientists share research findings prior to acceptance by peer-reviewed publications. It turns out that group was <a href="http://www.nature.com/news/gravitational-waves-discovery-now-officially-dead-1.16830">wrong</a> – they were actually looking at galactic dust. </p>
<p>The LIGO scientists were more careful. Fred Raab, head of the LIGO laboratory, <a href="http://www.geekwire.com/2016/after-gravitation-wave-rumors-its-getting-close-to-go-time-for-advanced-ligo-results/">explained</a>:</p>
<blockquote>
<p>As we have done for the past 15 years, we take data, analyze the data, write up the results for publication in scientific journals, and once the results are accepted for publication, we announce results broadly on the day of publication or shortly thereafter. </p>
</blockquote>
<p>And that’s what they did, timing their news conferences and media outreach to coincide with the <a href="http://physics.aps.org/featured-article-pdf/10.1103/PhysRevLett.116.061102">official publication</a> in the scientific journal Physical Review Letters about their discovery. Why did they delay their public announcement rather than spread the word as widely as possible as soon as possible?</p>
<h2>Science’s standard operating procedure</h2>
<p>Although it may sound unnecessarily cautious, the process Raab described is how most scientists prepare and vet discoveries prior to announcing them to the world – and, indeed, it’s the process most scientific journals insist upon. <em>Nature</em>, for example, <a href="http://www.nature.com/authors/policies/embargo.html">prohibits</a> authors from speaking with the press about a submitted paper until the week before publication, and then only under conditions set by the journal. </p>
<p>Scientific publishing serves both the scientist and the public. It’s a quid pro quo: the authors get to claim priority for the result – meaning they got there before any other scientists did – and in return the public (including competing scientists) gets access to the experimental design, the data and the reasoning that led to the result. Priority in the form of scientific publishing earns scientists their academic rewards, including more funding for their research, jobs, promotions and prizes; in return, they reveal their work at a level of detail that other scientists can build on and ideally replicate and confirm. </p>
<p>News coverage of a scientific discovery is another way for scientists to claim priority, but without the vetted scientific paper right there alongside it, there is no quid pro quo. The claim is without substance, and the public, while titillated, does not benefit – because no one can act on the claim until the scientific paper and underlying data are available.</p>
<p>Thus, most scientific journals insist on a “press embargo” – a period during which scientists, and reporters who are given advance copies of articles, agree not to publish in the popular press until the scientific peer review and publishing process is complete. With the advent of <a href="http://www.infotoday.com/searcher/oct00/tomaiuolo&packer.htm">preprint servers</a>, however, this process itself is evolving. </p>
<p><a href="http://dx.doi.org/10.1056/NEJM197706022962204">First introduced</a> in 1977, journal embargoes reflect a scientific journal’s desire both to protect its own <a href="http://dx.doi.org/10.1056/NEJM198110013051408">newsworthiness</a> and to protect the public from misinformation. If a result is wrong (as was the case with the 2014 gravitational wave result), peer review is supposed to catch it. At the least, it means experts other than the researchers themselves examined the experimental design and the data and agreed that the conclusions were justified and the interpretations reasonable. </p>
<p>Often, results are more “nuanced” than the news article or press conference suggests. Yes, this new drug combination makes a (minor) difference, but it doesn’t cure cancer. Finally, the result could be correct, but not because of the data in that paper, and the premature press conference claims an unwarranted priority that can disrupt other research. In all these cases, having access to the research article and the underlying data is critical for the news to be meaningful.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/111035/original/image-20160210-12153-9yc2pi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/111035/original/image-20160210-12153-9yc2pi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/111035/original/image-20160210-12153-9yc2pi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/111035/original/image-20160210-12153-9yc2pi.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/111035/original/image-20160210-12153-9yc2pi.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/111035/original/image-20160210-12153-9yc2pi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/111035/original/image-20160210-12153-9yc2pi.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/111035/original/image-20160210-12153-9yc2pi.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Peer-reviewed and published.</span>
<span class="attribution"><span class="source">Maggie Villiger</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span>
</figcaption>
</figure>
<h2>Purposes of a press embargo</h2>
<p>A press embargo has additional benefits for the reporter, the journal and the public.</p>
<p>Multiple journalists get an equal chance to publish a well-researched and balanced article. In exchange for respecting the journal’s press embargo, reporters find out what’s being published ahead of time. This gives them a chance to read the scientific article, find experts who can help them make sense of it, and publish a carefully crafted story. From the scientist’s (and scientific journal’s) perspective, this maximizes the quality and quantity of press coverage.</p>
<p>The public gains access to the scientific article very close to the time they read the news story. The popular press tends to bias a story toward what’s “newsworthy” about it – and that sometimes winds up exaggerating or otherwise inaccurately summarizing the scientific article. When that article relates to human health, for instance, it’s important that doctors have access to the original scientific paper before their patients start inquiring about new treatments they’d heard about in the news.</p>
<p>Other scientific experts gain access to the scientific article as soon as the findings become news. Scientists who jump the gun and allow their research to become news before publication in an academic journal are making unvetted claims that can turn out to be less important once the peer-reviewed article eventually appears.</p>
<p>A press embargo can protect a scientist’s claim for priority in the face of competition from other scientists and journals. Scientists generally accept journal publication dates as indicators of priority – but when a discovery makes news, the journal considering a competitor’s paper often both releases its authors from the embargo and races the paper to publication. And, if your competitor’s paper comes out first, you’ve lost the priority race.</p>
<p>The embargo system allows time for prepublication peer review. Most experiments designed to address research questions are complicated and indirect. Reviewers often require additional experiments or analyses prior to publication. Prepublication peer review can take a long time, and its value <a href="http://dx.doi.org/10.1242/dmm.001388">has been</a> <a href="https://www.theguardian.com/science/occams-corner/2015/sep/07/peer-review-preprints-speed-science-journals">questioned</a>, but it is currently the norm. If a news story came out on the paper while it was under review, the process of peer review could be jeopardized by pressure to “show the data” based on the news article. Many journals would decline publication under those conditions, leaving the authors and public in limbo.</p>
<p>I know of no case in which talking about a discovery in advance of scientific publication helps the public. Yes, “breaking news” is exciting. But journalists and other writers can tell riveting stories about science that convey the excitement of discovery without breaking journal embargoes. And the scientific community can continue to work on speeding its communication with the public while preserving the quid pro quo of scientific publication.</p>
<p class="fine-print"><em><span>Vivian Siegel does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Sometimes big research news bypasses the usual scientific publishing process. Here’s why that’s not good for scientists or the public.Vivian Siegel, Visiting Instructor of Biological Engineering, Massachusetts Institute of Technology (MIT)Licensed as Creative Commons – attribution, no derivatives.