Distributed peer review cuts funding decision time, trials show. But there are risks
- Juliette Portala
- Sep 1
Updated: Nov 10
Asking applicants for research funding to decide who will receive grants slashes review times by more than half, according to a new study from the Research on Research Institute at University College London and the Volkswagen Foundation. But with fierce competition for funding, safeguards are needed to counter attempts to game the system.
Applications to research funding schemes are traditionally assessed by external expert reviewers appointed by the funder. An alternative, known as distributed peer review (DPR), requires applicants to a funding call to also act as assessors of the other applications.
Originally designed to allocate observing time on astronomical telescopes more efficiently, the approach is now gaining ground among major funders, including Germany’s Volkswagen Foundation, the Dutch Research Council (NWO), UK Research and Innovation (UKRI), and, in the US, the National Science Foundation and the National Institute of Food and Agriculture.
The advantage of DPR is that it immediately generates a large pool of reviewers for each call, people who are not only well-versed in the field but also already familiar with its specific demands and invested in the process. There is no need for the funder to assemble and run external panels, and no lag while these ‘experts’ get up to speed. This saves both time and resources.
The new study found that applicants involved in DPR spent 60% less time on average reviewing proposals than external experts.
“Speeding reviewing up will allow funders to better manage the funds they have for research,” said Melanie Benson Marshall of the University of Sheffield, one of the primary authors of the paper. “Faster grant funding times will mean lower costs and faster research outputs, and that has to be good.”
While this may look like shifting the administrative burden from funders to applicants, the study suggests that the additional commitment has not deterred researchers from applying.
“It is true that DPR adds reviewing time to application time, but we believe that the reviewing time is relatively modest compared to the overall time it takes to prepare an application,” said Benson Marshall, adding that funders could also shorten applications to offset the extra time spent reviewing.
A further advantage of DPR is that it raises the number of reviewers per application, meaning applicants are likely to receive more feedback, and from more diverse sources.
Funders, researchers rewarded
NWO is one of the funders that has found benefits from experimenting with DPR.
“Our Open Competition XS calls usually take about two months from deadline to decision on granting,” said Jennifer Bendsneijder, a spokesperson for the organisation.
Within this process, applicants are asked to rank ten proposals, in each case addressing the two assessment criteria with short statements of 15-60 words each. “The time the assessors [i.e. other applicants] spent on the assessment of ten proposals varies greatly, but for most this did not take longer than one day,” Bendsneijder added.
Elsewhere in Europe, UKRI began to implement DPR as part of a wider programme to streamline and vary its assessment processes for grant applications.
“It allows UKRI to channel funding more nimbly to time-sensitive research opportunities or respond to emerging research areas or trends,” a spokesperson explained. “It reduces the risk for applicants that delays in funding decisions disrupt the momentum of a project or area of work and makes it easier for researchers to act on moments of inspiration or discovery; and it offers applicants more certainty, which may be critical for career progress and job security.”
Applicants are just as positive about DPR. For example, submissions to a Volkswagen Foundation call in the humanities and cultural studies rose by 18% between the first and the second trial of DPR. Meanwhile, 83% of funded and 60% of unfunded applicants said that they would likely participate in future funding calls using DPR.
“While applicants don’t receive rewards per se, they might benefit from DPR in other ways: receiving a faster decision, getting more feedback on their own proposal from multiple perspectives and gaining experience in reviewing, which might in turn improve grant writing,” said Anna Butters of the University of Sheffield, another author on the study.
Meanwhile, UKRI found that 88% of applicants involved in its DPR trial believed that taking part in the process had improved their application writing skills, while 84% said that it had increased their knowledge.
Assessment safeguards
But using DPR requires some caution. According to Butters, there are concerns that applicants might try to game the system, for example by giving harsh reviews in order to increase their own chances of being selected for funding.
“Our analysis did audit reviews for this behaviour and we couldn’t find any evidence of it occurring,” she said. “It’s definitely something any funder running such a scheme should worry about, and introduce mechanisms that discourage such behaviour in the first place, as well as audit for once reviews are in.”
For its trial, the Volkswagen Foundation trimmed the lowest and highest scores received by each proposal to prevent it from being especially favoured or disadvantaged, she noted. Other solutions to counter this risk include “consensus scoring”, which rewards applicants whose scores align with those of other reviewers, and the creation of two sub-pools.
“Reviewers only assess applications from the pool which does not contain their own application. This means that the reviews provided cannot impact on the likelihood of the reviewer’s own application receiving funding,” Butters said. “Such an approach is most suitable where there is a fairly homogeneous group of applications.”
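To make the score-trimming safeguard concrete, here is a minimal sketch in Python. The function names, the 1-5 rating scale and the proposal names are illustrative assumptions, not any funder's actual implementation; the point is simply that dropping each proposal's single lowest and highest review limits how much one strategically harsh (or generous) reviewer can move the result.

```python
# Illustrative sketch only: trimmed scoring, in which each proposal's single lowest
# and highest review scores are dropped before averaging, so one outlier reviewer
# cannot strongly favour or sink an application.
# The 1-5 scale and proposal names are hypothetical, not any funder's real data.

def trimmed_mean(scores: list[float]) -> float:
    """Average the scores after removing one lowest and one highest value."""
    if len(scores) < 3:
        # Too few reviews to trim meaningfully; fall back to a plain mean.
        return sum(scores) / len(scores)
    trimmed = sorted(scores)[1:-1]  # drop the single minimum and maximum
    return sum(trimmed) / len(trimmed)

# Example: one harsh outlier (the 1) barely moves proposal_A's trimmed score.
reviews = {
    "proposal_A": [4, 5, 4, 4, 1],
    "proposal_B": [3, 3, 4, 3, 3],
}
for proposal, scores in reviews.items():
    print(proposal, round(trimmed_mean(scores), 2))
```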
UKRI has already built in similar safeguards to prevent researchers from influencing their own outcomes.
For instance, the organisation randomises applicants into two reviewer pools and divides the budget into two ringfenced pots. “We fund an equal number of projects from each pot. As a result, no one is ever reviewing a proposal they are directly competing against,” its spokesperson said. “Applicants are aware that there is nothing to be gained from harsher marking.”
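As a rough illustration of this two-pool design, the sketch below randomises applicants into two pools, assigns each person to review only the other pool's proposals, and funds an equal number of top-ranked proposals from each ringfenced pot. All names, counts and the ranking step are hypothetical; this is a sketch of the idea described above, not UKRI's actual process.

```python
# Illustrative sketch of the two-pool safeguard: applicants are randomly split into
# two reviewer pools, each person only reviews proposals from the other pool, and an
# equal number of awards is made from each pool's ringfenced pot.
# All names and numbers are hypothetical assumptions.

import random

def split_into_pools(applicants: list[str], seed: int = 0) -> tuple[list[str], list[str]]:
    """Randomise applicants into two pools of (near-)equal size."""
    shuffled = applicants[:]
    random.Random(seed).shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

def review_assignments(pool_a: list[str], pool_b: list[str]) -> dict[str, list[str]]:
    """Each applicant reviews only proposals from the pool not containing their own."""
    return {**{a: pool_b for a in pool_a}, **{b: pool_a for b in pool_b}}

def fund_equally(ranked_a: list[str], ranked_b: list[str], awards_per_pot: int) -> list[str]:
    """Fund the same number of top-ranked proposals from each ringfenced pot."""
    return ranked_a[:awards_per_pot] + ranked_b[:awards_per_pot]

applicants = ["ana", "ben", "chloe", "dev", "elena", "farid"]
pool_a, pool_b = split_into_pools(applicants)
print(review_assignments(pool_a, pool_b))
# Placeholder ranking: in practice each pool would be ranked by its review scores.
print(fund_equally(sorted(pool_a), sorted(pool_b), awards_per_pot=1))
```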
DPR is not suitable for all kinds of grant, however. The study says that, in exceptional circumstances, applicants will not have the expertise to carry out reviews, which means that funders will need to bring in additional, non-applicant reviewers.