The project-based funding model in research: what impact on contemporary research? 2/2

PART 2: WHY WE SHOULD BE HAPPY WITH THE DEVELOPMENT OF COMPETITIVE MODELS IN RESEARCH FUNDING AND HOW WE COULD MAKE THE MOST OF IT

After detailing the criticism that may (sometimes rightly) be directed at a funding model in which research is funded on a competitive basis after a process of selective project evaluation, I will now examine how this model can be ‘saved’: what advantages it offers, how its potentially damaging effects can be neutralised, and under what conditions it might in fact turn out to be more efficient and challenging than other models based on recurrent endowments.

In particular, I would like to introduce the funding model proposed in the 2013 RAND report entitled Alternatives to Peer Review in Research Project Funding, which does not renounce the idea of funding projects on a peer-review basis, but proposes alternative ways to guarantee the efficiency and relevance of this mode of funding. To put it another way, one sticks to the idea of competitive funding processes for research projects based on open calls and peer-review selection, but refines the methodology of this process at each of its steps in order to suppress potential biases and negative effects while preserving the advantages of this kind of procedure. Instead of just launching ‘blank’ calls, the point is to stimulate original and perhaps unexpected research proposals and intuitions. Instead of applying formatted and conservative selection standards, the point is to encourage collective reflection on the various qualities a project may present in order to be accepted. Instead of discouraging rejected applicants, the purpose is to encourage the reworking of rejected (and even accepted) projects by providing clues for their refinement. Instead of limiting research to short-term, ad hoc investigations, the broad ambition of this model is to foster long-term but rationalised research projects, adequately ‘sliced’ into relevant phases, steps and milestones.

In concrete terms, the report proposes ‘a three-phase model of the science funding process: a development phase for research questions and proposals; a selection phase where the most promising proposals are selected; and finally a research phase in which the research is carried out’ 1. This model is represented in the following figure, which shows how the three traditional phases of development, selection and implementation can be refined:

Figure 1

Now, when we examine the details of this proposed model, we notice that it actually answers the major objections formulated against competitive research funding models.

– Encouragement of creativity: This aspect is restored at the early stage of the development phase, i.e. when the research axes are defined. One switches here from a model where ‘the funding organisation decides on a broad set of research priorities before allocating a fixed amount of resources to each programme area’ to a model in which ‘the funder breaks up big challenges, such as developing a cure for a disease, into a road map of smaller questions. Researchers then compete to answer these smaller questions’. This, in turn, ‘allows a strategic, stepwise approach, and easier judgement of whether particular research aligns with the funder’s aims’. This may also help scientists to tackle specific and original issues with concrete impact, rather than aiming at general scientific programmes that are excessively broad in definition.

To counterbalance the limitation of projects to mainstream or conservative research, this system also guarantees that ‘funds are allocated with a balanced risk approach, rather than by topic. Riskier projects with uncertain outcomes, but which may be groundbreaking, are more likely to be funded in this way because funders can balance them with awards to conservative projects with predictable outcomes’.

Another objection to the traditional model was that it may be incompatible with the development of multidisciplinary research. This objection is met by the composition of committees within this new approach, where ‘decision-makers, community members and academic reviewers from cross-cutting disciplines can add perspective beyond the pure scientific merit of a proposed research project’.

– Dynamism: This new scheme would also encourage a proactive attitude from research teams, since they are no longer waiting passively for calls for projects to be launched. On the contrary, ‘the funding body goes out and “beats the bushes” to generate interest and proposals, interacting with researchers and teams to ask them to submit applications’.

In addition, this system limits the discouragement linked to project rejection, since the applicant is constantly encouraged to refine and resubmit his/her project. Similarly, funded projects are not merely accepted as such, but are also encouraged to be reshaped in order to better match their goals. This happens at the ‘refining stage’ of the model, where ‘money is awarded, but is conditional on changes and developments suggested by the funding body. The funder works with the research team to improve the proposed work programme before the start of the project’.

– Transparency: Transparency and objectivity of selection criteria are supposed to be guaranteed in this model, since the selection phase is committed to the application of objective and calculable criteria rather than qualitative feedback. In concrete terms: ‘Funding is awarded transparently on the basis of the scores given by reviewers rather than via a group decision’. The process is therefore open and understandable, which limits the discouragement of rejected researchers, first because they have more guarantees of the fairness of the process and second because they have better insights into the ways to improve their project.
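To make this score-based, risk-balanced selection mechanism concrete, here is a minimal sketch in Python. The proposal names, reviewer scores, risk labels and the specific funding rule are all illustrative assumptions of mine, not part of the RAND report's actual specification; the sketch only shows the general idea of ranking transparently by reviewer scores while guaranteeing a minimum share of risky projects:

```python
from statistics import mean

# Hypothetical proposals: (name, reviewer scores out of 10, risk profile).
# All data below is invented for illustration.
proposals = [
    ("P1", [8, 9, 7], "conservative"),
    ("P2", [9, 6, 8], "risky"),
    ("P3", [5, 6, 4], "conservative"),
    ("P4", [7, 7, 9], "risky"),
]

def select(proposals, budget=3, min_risky=1):
    """Fund the `budget` top-ranked proposals by mean reviewer score,
    then ensure at least `min_risky` risky projects are in the portfolio
    (a crude stand-in for the report's 'balanced risk approach')."""
    ranked = sorted(proposals, key=lambda p: mean(p[1]), reverse=True)
    funded, rest = ranked[:budget], ranked[budget:]
    n_risky = sum(1 for p in funded if p[2] == "risky")
    while n_risky < min_risky:
        # Swap the lowest-scoring funded conservative project for the
        # highest-scoring unfunded risky one, if any.
        out = next((p for p in reversed(funded) if p[2] == "conservative"), None)
        swap_in = next((p for p in rest if p[2] == "risky"), None)
        if out is None or swap_in is None:
            break
        funded[funded.index(out)] = swap_in
        rest.remove(swap_in)
        n_risky += 1
    return funded

funded = select(proposals)
print([p[0] for p in funded])
```

Because the ranking is a pure function of the reviewers' scores, any applicant can verify why a given proposal was or was not funded, which is the transparency property discussed above.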

– Compatibility with long-term research: Lastly, one major criticism of the competitive model of project-based research funding was the difficulty of planning long-term research projects within it. This issue is clearly addressed by the designers of this funding model, who remark that:

Major scientific breakthroughs that require a long time to achieve, and that naturally build upon an evolving body of smaller discoveries, are not well supported by ambitious, ex-ante requirements for lofty outcomes. The size and scale of these undertakings – curing cancer or HIV, for example – can be too daunting for any single research group or organisation to undertake.

But the authors also argue that this proposed pattern will not only make long-term projects possible on the basis of a succession of decisive steps, but will even condition the very possibility of them. As a matter of fact, long-term projects must proceed by setting up regular and decisive milestones. This is why they defend ‘milestone-based funding strategies’ that ‘are characterised by intermediate products that aim to make progress towards a specific, pre-defined goal, rather than trying to achieve the goal outright’. This milestone-based strategy is precisely at the heart of their funding model, in which researchers are encouraged to compete on smaller issues. In this respect, the point is not only that project-based funding models are compatible with the development of long-term research. In fact, they might even appear as a condition of possibility for such long-term research, by leading researchers to map out the path to follow in order to achieve their scientific goals.

Having examined this, let us mention another advantage of (smartly shaped) competitive research funding models, namely that they create more opportunities for researchers or research teams to obtain funding. This is also part of the boosting effect induced by this model: whereas recurrent endowments are permanent, regular calls for projects imply a diversified funding strategy, with the idea that failure is never final: the applicant may regularly resubmit his/her project and/or apply to alternative grants offered by other institutions.

Of course, we have noted that this peer-review system has been criticised for being a burden both to researchers and to reviewers. On the other hand, reducing the frequency of calls may also be viewed as an impoverishment of opportunities for researchers. And as a matter of fact, this is precisely what happened in 2012, when over 550 US biologists sent an open letter to the US National Science Foundation (NSF) 2 to denounce its new funding policies, which switched from a system where researchers had two chances to apply for funding each year to a system where they have just one opportunity to apply, in two stages 3. The aim of this reform, as described by the NSF, was to simplify the process of project review, which was becoming overwhelming. And yet, the scientists concerned by this revision of funding processes gave the following diagnosis:

We recognise that increasing proposal submissions and declining funding rates are creating undue burdens on programme directors, investigators and the community of reviewers. Nevertheless, we feel that the new pre-proposal process is slowing the pace of science at a time when society’s need is increasing for timely and sound science to inform solutions to tough environmental problems. Moreover, the new process does not ensure that the best science is funded with the limited funds that are available.

In other words, these scientists claim to be ready to bear the cost of application and review processes, for they see this as a condition for having regular opportunities to develop their projects, while guaranteeing the quality of funding and sustaining the pace of scientific research.

One may conclude this study with an empirical example of a well-founded funding policy, applying the good practices we have just described and producing efficient results. This example is the Howard Hughes Medical Institute (HHMI) 4, detailed by Pierre Azoulay, Joshua S. Graff Zivin and Gustavo Manso in their paper on ‘Incentives and creativity’ 5. This institute has developed a funding programme that distributes USD 700 million per year, relying on the following principles: ‘the award cycles are long (five years, and typically renewed at least once); the review process provides detailed, high-quality feedback to the researcher; and the program selects “people, not projects”, which allows (and in fact encourages) the quick reallocation of resources to new approaches when the initial ones are not fruitful’. In other words, this funding policy exploits the positive effects of competitive funding (its challenging effect, its encouragement of a quick research rhythm), but also tries to neutralise its negative effects, such as conservatism, opacity of criteria and discouragement of long-term research. Now, looking at the effective results of this incentive-based approach to competitive funding (in terms of bibliometric indices or publication results), one finds that this policy does result in the effective development of bold, vivid and exploratory research:

We find that selection into the HHMI investigator programme, which rewards long-term success, encourages intellectual experimentation and provides rich feedback to its appointees, leads to higher levels of breakthrough innovation, compared with NIH (National Institutes of Health) funding which is characterised by short grant cycles, pre-defined deliverables and unforgiving renewal policies.

In other words, the competitive research funding models adopted by research agencies may of course have various flaws and weaknesses. However, they may also be shaped in a strategic and alternative manner that turns them into a genuine factor of dynamism for research and innovation, both on a short-term and on a long-term basis.

**

*Note: This article gives the views of the author, and not the position of SIRIS Lab, nor of SIRIS Academic. Please review our Comments Policy if you have any concerns on posting a comment below.

About the author, Sabine Plaud: I have a training in philosophy with a French PhD on the philosophy of Ludwig Wittgenstein. My theoretical concerns regard philosophy of language and in particular, the link between pictures and discourses, with a focus on the notion of modelization. In my activities at SIRIS Academic, I am thrilled by issues regarding research strategies and management of teaching and education policies.

Information about Featured Image: “Crowdfunding” by Rocío Lara, 18/04/2013. Available here.

Notes:

  1. Susan Guthrie, Benoît Guérin, Helen Wu, Sharif Ismail, and Steven Wooding: Alternatives to Peer Review in Research Project Funding, RAND Report, April 2013, prepared for the UK Department of Health-funded Centre for Policy Research in Science and Medicine (PRiSM): http://www.rand.org/pdfrd/pubs/research_reports/RR139.html
  2. ‘An Open Letter from the Ecological and Environmental Sciences Community Regarding the New Pre-proposal Process in the Directorate of Biological Science’, August 2012:
    http://www.cbs.umn.edu/sites/default/files/public/downloads/EcologistsLetter-NSF.pdf.
  3. See ‘US biologists decry funding changes’ in Nature 490, 159 (11 October 2012):
    http://www.nature.com/news/us-biologists-decry-funding-changes-1.11562.
  4. https://www.hhmi.org/.
  5. Pierre Azoulay, Joshua S. Graff Zivin and Gustavo Manso: ‘Incentives and creativity: evidence from the academic life sciences’, in The RAND Journal of Economics 42 (3), 2011, pp. 527–554.
