
The peer review system is broken. We asked academics how to fix it

The peer review process is a cornerstone of modern scholarship. Before new work is published in an academic journal, experts scrutinise the evidence, research and arguments to make sure they stack up.

However, many authors, reviewers and editors have problems with the way the modern peer review system works. It can be slow, opaque and cliquey, and it runs on volunteer labour from already overworked academics.

Last month, one of us (Kelly-Ann Allen) expressed on Twitter her frustration at the difficulty of finding peer reviewers. Hundreds of replies later, we had a huge crowd-sourced collection of criticisms of peer review and suggestions for how to make it better.

The suggestions for journals, publishers and universities show there is plenty to be done to make peer review more accountable, fair and inclusive. We have summarised our full findings below.

Three challenges of peer review

We see three main challenges facing the peer review system.

First, peer review can be exploitative.

Many of the companies that publish academic journals make a profit from subscriptions and sales. However, the authors, editors and peer reviewers generally give their time and effort on a voluntary basis, effectively performing free labour.

And while peer review is often seen as a collective enterprise of the academic community, in practice a small fraction of researchers do most of the work. One study of biomedical journals found that, in 2015, just 20% of researchers performed up to 94% of the peer reviewing.

Peer review can be a ‘black box’

The second challenge is a lack of transparency in the peer review process.

Peer review is generally carried out anonymously: researchers don’t know who is reviewing their work, and reviewers don’t know whose work they are reviewing. This provides space for honesty, but can also make the process less open and accountable.

The opacity may also suppress discussion, protect biases, and decrease the quality of the reviews.

Peer review can be slow

The final challenge is the speed of peer review.

When a researcher submits a paper to a journal, and it is not rejected outright, they may face a long wait for review and eventual publication. It is not uncommon for research to be published a year or more after submission.

This delay is bad for everyone. For policymakers, leaders and the public, it means they may be making decisions based on outdated scientific evidence. For scholars, delays can stall their careers as they wait for the publications they need to get promotions or tenure.

Scholars suggest the delays are typically caused by a shortage of reviewers. Many academics report that heavy workloads discourage them from participating in peer review, and the problem has worsened since the onset of the COVID-19 pandemic.

Research has also found that many journals rely heavily on US and European reviewers, limiting the size and diversity of the reviewer pool.

Can we fix peer review?

So, what can be done? Most of the constructive suggestions from the large Twitter conversation mentioned earlier fell into three categories.

First, many suggested there should be better incentives for conducting peer reviews.

This might include publishers paying reviewers (the journals of the American Economic Association already do this) or giving some profits to research departments. Journals could also offer reviewers free subscriptions, publication fee vouchers, or fast-track reviews.

However, we should recognise that journals offering incentives might create new problems.


Another suggestion is that universities could do more to acknowledge peer review as part of the academic workload, and perhaps reward outstanding contributors to peer review.

Some Twitter commentators argued tenured scholars should review a certain number of articles each year. Others thought more should be done to support non-profit journals, given a recent study found some 140 journals in Australia alone ceased publishing between 2011 and 2021.

Most respondents agreed that conflicts of interest should be avoided. Some suggested databases of experts would make it easier to find relevant reviewers.

Use more inclusive peer review recruitment strategies

Many respondents also suggested journals could improve how they recruit reviewers and how they distribute the work. For example, reviewers could be selected for either methodological or content expertise, and asked to focus on that element rather than both.

Respondents also argued journals should do more to tailor their invitations to target the most relevant experts, with a simpler process to accept or reject the offer.

Others felt that more non-tenured scholars, PhD researchers, people working in related industries, and retired experts should be recruited. More peer review training for graduate students and increased representation for women and underrepresented minorities would be a good start.

Rethink double-blind peer review

Some respondents pointed to a growing movement towards more open peer review processes, which may create a more human and transparent approach to reviewing. For example, Royal Society Open Science publishes all decisions and review letters, and allows peer reviewers to voluntarily identify themselves.

Another suggestion to speed up the publishing process was to give higher priority to time-sensitive research.

What can be done?

The overall message from the enormous response to a single tweet is that there is a need for systemic changes within the peer review process.

There is no shortage of ideas for how to improve the process for the benefit of scholars and the broader public. However, it will be up to journals, publishers and universities to put them into practice and create a more accountable, fair and inclusive system.


The authors would like to thank Emily Rainsford, David V. Smith and Yumin Lu for their contribution to the original article Towards improving peer review: Crowd-sourced insights from Twitter.
