NHMRC Public Consultations


Public Consultation on the NHMRC Draft Principles of Peer Review submission

This submission reflects the views of: Individual
Background: Researcher – other
General Comments

Submission to National Health and Medical Research Council Draft Principles of Peer Review for Consultation

Dear Sir/Madam,

We have no objection to our submission or contact details being made publicly available.

Who we are

We are researching the science behind funding peer review, and investigating whether there are ways of awarding research funding that are more efficient. Our group comprises two health economists, a statistician and an epidemiologist. Some of our research is funded by an NHMRC Project Grant awarded in 2011.

“Continuous improvement. Peer review utilises new technologies and best practice in order to maximise the benefits of peer review and minimise individual workloads.”

We welcome the NHMRC’s determination to continually improve their peer review systems.

In terms of workloads, our research on grant peer review has led us to conclude that the current NHMRC peer review systems are unnecessarily time-consuming for researchers. Based on a survey of researchers who submitted a Project Grant proposal in 2012, we estimate that 550 years of chief investigator time were spent preparing proposals, equivalent to 66 million dollars in salary costs. This is a huge annual cost, and much of this time could instead be spent on actual research. The time spent preparing grant proposals could be reduced, with little impact on peer review, by removing those parts of the application that are not used by reviewers.
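As a back-of-envelope check of the two figures above (the $120,000 average salary is our inference from the quoted numbers, not a figure taken directly from the survey):

```python
# Back-of-envelope check of the preparation-cost estimate.
# The average salary is inferred from the two quoted figures,
# not taken from the survey itself.
ci_years = 550               # chief investigator years spent per round
salary_cost = 66_000_000     # quoted total salary cost in dollars

implied_salary = salary_cost / ci_years
print(f"Implied average CI salary: ${implied_salary:,.0f} per year")
# -> Implied average CI salary: $120,000 per year
```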

Freeing up time for Australian researchers would allow them to spend more time on actual research and on grant applications to overseas funding bodies. This would create more scientific outputs for Australia and encourage more foreign investment to Australia.

“Continuous improvement. [...] Peer review is responsive to criticism to minimise weaknesses.”

We believe that there are weaknesses in the current system, and that the best way to address these weaknesses is through research. Changes to the peer review system should no longer be based on anecdote, expert opinion or the designs of other funding agencies (unless those designs have been scientifically tested). The NHMRC could use the following methods to drive continuous improvement (methods that are widely used in the studies it funds):

  • Experimental studies to compare alternative peer review methods.
  • Cost-effectiveness analyses to compare the gains in peer review accuracy against monetary and time costs.
  • Surveys of the research community. For example, surveying recent peer reviewers about which sections of the application had little or no influence on their decisions. These sections could then be cut from future applications, which would save time for researchers.

Robust evidence from well designed studies is the best way to convince researchers of the need for change, even if this change is unpopular.

For example, we know that an expression of interest (EOI) has been tried before for Project Grants but was abandoned due to its unpopularity. The accuracy of an EOI system could be tested in a study that assessed the same applications using panels with and without an EOI process. If an EOI process proved comparably accurate to the full process, the time savings would be potentially huge, as those Australian researchers knocked out in the first round could spend time on other research activities.

If an EOI could be used to reject 30% of proposals, and assuming that an EOI takes one-quarter of the time to prepare as a full proposal, then (based on our recent survey) this would save 124 years of chief investigator time per year. This saved time is equivalent to funding 124 new post-doctoral positions per year.
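The arithmetic behind this estimate can be sketched as follows (the 30% rejection rate and the one-quarter effort ratio are the assumptions stated above, applied to the 550 years from our survey):

```python
# Sketch of the EOI time-saving arithmetic, using the assumptions
# stated in the text: 30% of proposals rejected at the EOI stage,
# and an EOI taking one-quarter of the time of a full proposal.
total_ci_years = 550     # CI time spent on full proposals per year (survey)
rejected_share = 0.30    # fraction of proposals knocked out at EOI stage
eoi_fraction = 0.25      # EOI effort relative to a full proposal

# Rejected applicants spend only the EOI fraction of the time
# they would otherwise have spent on a full proposal.
years_saved = total_ci_years * rejected_share * (1 - eoi_fraction)
print(f"CI years saved per year: {years_saved:.0f}")
# -> CI years saved per year: 124
```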

Another interesting (and often controversial) area of research is conflicts of interest. Currently many of the most qualified people on a Grant Review Panel have to leave the room for even a minor conflict of interest. Anecdotally, we know that this has led to situations where the remaining GRP members are not able to properly assess a proposal. The current management of conflicts is therefore impacting on the quality of peer review. Research could help us quantify the degree to which the current management of conflicts impacts on funding decisions, and whether relaxing the rules would improve the peer review process.

The best way to gather solid scientific evidence for continual improvement in peer review is to engage the expertise of the Australian research community. These people have the skills and experience to run large and complex trials, and these skills could easily be applied to studying peer review. We propose that "peer review research" be made a special initiative, with money set aside for one or two Project Grants per funding round.

We have worked with the NHMRC in the past on researching Project Grants, and we are currently working with the NHMRC on research into Early Career Fellowships. We hope that the NHMRC will continue to use research as part of its continuous improvement.

“Continuous improvement. [...] Participants are given training and feedback to help improve their performance.”

Participants are currently not given sufficient feedback to improve their performance. For many NHMRC schemes the reasons for failure are not made clear.

  • Failed Project Grant applicants only receive scores on three criteria, their category (1 to 7), and the written feedback from the external reviewer(s).
  • Failed Career Development Fellowship applicants only receive five scores (on a 1–3 scale) and a line or two of feedback.
  • Failed Early Career Fellowship applicants only receive seven scores (on a 1–3 scale) and a line or two of feedback.

Whilst scores may be useful for deciding whether to re-submit a failed proposal, the researchers have little idea of why their proposal failed. This means they must guess about what changes to make before re-submitting. This is often frustrating and disheartening.

The best system of peer review would give the maximum amount of feedback possible, as this maximises the possibility for improvement. This feedback could include:

  • A de-identified transcript of what was said about the proposal during the panel meeting.
  • The de-identified scores from all panel members and external reviewers.

The current discarding of the expert opinions and individual scores of the panels is a waste of valuable information. The panels are rightly made up of some of the best scientists in the country, and researchers would greatly benefit from knowing their opinions.

The NHMRC might be worried about appeals from a disgruntled few if so much feedback were given, but a simple process under which applicants waive their right of appeal would afford the NHMRC some protection.

A clear comparison can be made with journal peer review. When papers are rejected by journals the editors usually explain why the paper was rejected, and give their feedback and the thoughts of the reviewers that can be used to improve the paper. Although this feedback is sometimes of poor quality, it usually tells the researcher where improvements need to be made before submitting to the next journal.

We understand that the Commissioner for Complaints wrote to the NHMRC to say that the previously given narrative feedback was “misleading”. Giving the maximum amount of information possible could not be “misleading”, as it would be the information used to make the decision.

This system of detailed feedback could be refined so that it is given only to near-miss proposals. These researchers would be encouraged to explicitly address the comments when re-submitting. The US National Institutes of Health currently use such a system for re-submissions. They also allow only one re-submission, the stated aim being "to facilitate funding of high quality applications earlier, with fewer resubmissions".

We would be happy to discuss these important issues further.

Yours sincerely,

Professor Nicholas Graves, Queensland University of Technology

Associate Professor Adrian Barnett, Queensland University of Technology

Dr Danielle Herbert, Queensland University of Technology

Professor Philip Clarke, University of Melbourne


N Graves, et al. Funding grant proposals for scientific research: retrospective analysis of scores by members of grant review panel. BMJ. Volume 343, April 2011.

D Herbert, et al. Australia: behind the times on preparing grant proposals. Under review.

Page reviewed: 19 February, 2013