Marsden 2013: Big increase in funding lifts success rate

By Shaun Hendy 24/01/2014

This post is late, very late! I have a long list of excuses, many of which involve moving to Auckland and writing a Centre of Research Excellence Proposal. But with the 2014 Marsden round almost upon us, it is well past time to look at the numbers from 2013.

2013 saw a big increase in the funds handed out. In fact, the $68m awarded was the largest nominal amount ever*; only the 2009 round ($65m) surpasses it once you adjust for inflation. In real terms, the Marsden Fund has handed out about 18% more each year over the period 2008-2013 than it did over the preceding decade. The average funding awarded to each successful proposal (fast-start and standard) continues to hover just below $600k.
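
As a rough check on that inflation comparison, here is a minimal sketch in Python. The cumulative CPI ratio is a placeholder assumption of mine, not an official deflator, and the underlying annual totals are not reproduced in this post.

```python
# Minimal sketch: compare the 2009 and 2013 rounds in 2013 dollars.
# The CPI ratio is a placeholder assumption, not an official figure.
nominal_2013 = 68.0  # $m awarded in 2013 (GST-exclusive, per the footnote)
nominal_2009 = 65.0  # $m awarded in 2009

assumed_cpi_ratio_2009_to_2013 = 1.09  # hypothetical cumulative inflation

real_2009_in_2013_dollars = nominal_2009 * assumed_cpi_ratio_2009_to_2013
print(f"2009 round in 2013 dollars: ${real_2009_in_2013_dollars:.1f}m")
print("2009 larger in real terms:", real_2009_in_2013_dollars > nominal_2013)
```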


If the total investment was up in 2013 while the funding per proposal remained static, then the number of funded projects must have risen. This was indeed the case, yet at the same time the number of proposals received by the Royal Society continued to climb. A record 1157 first-round proposals were submitted in 2013, compared with an average of about 800 per year over the period 1998-2007. This means that although a record-equalling 109 proposals were funded, the overall success rate of 9.4% remained below its long-run average of 10%.
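
The success-rate arithmetic behind those figures is simple to reproduce; here is a minimal sketch using only the numbers quoted above.

```python
# Success-rate arithmetic using the figures quoted in the post.
funded_2013 = 109      # record-equalling number of funded proposals
submitted_2013 = 1157  # record number of first-round proposals

success_rate_2013 = funded_2013 / submitted_2013
print(f"2013 success rate: {success_rate_2013:.1%}")  # ~9.4%

# The long-run average success rate quoted in the post is about 10%.
print("Below long-run average:", success_rate_2013 < 0.10)
```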


The proportion of funds awarded to fast-start grants for early career researchers (available to those within seven years of completing their PhDs) has continued to grow, but it still lags the proportion of applications: in 2013, 22% of the funds Marsden awarded went to fast-start grants, while 28% of applicants wrote fast-start proposals**. Would it perhaps be fair for the share of funding allocated to fast-starts to grow until it matches the proportion of fast-start applicants?
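
To put that gap in dollar terms, here is a back-of-the-envelope sketch, assuming the $68m total from 2013 and the shares quoted above; the dollar split is my own rough estimate, not an official breakdown.

```python
# Back-of-the-envelope: what would matching the applicant share mean in dollars?
total_awarded = 68.0          # $m awarded in 2013 (GST-exclusive)
fast_start_fund_share = 0.22  # share of funds that went to fast-starts
fast_start_app_share = 0.28   # share of applicants who wrote fast-start proposals

current = total_awarded * fast_start_fund_share
matched = total_awarded * fast_start_app_share
print(f"Fast-start funding at 22% of the pool: ${current:.1f}m")
print(f"Fast-start funding at 28% of the pool: ${matched:.1f}m")
print(f"Difference: ${matched - current:.1f}m")
```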


Fast-start proposals have had a success rate of just below 13% since they were created, higher than the 9% rate for standard proposals. Interestingly, the success rates of fast-start and standard applicants are only weakly correlated. As I noted last year, the fast-start scheme plays an important role in early career development for scientists now that the FRST post-doctoral fellowship scheme and the International Mobility Fund are gone. The Rutherford Discovery Fellowships also contribute to early career development, but they are relatively few in number.
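
If I read that correlation claim as being about the annual success-rate series for the two schemes, a check would look something like the sketch below. The two arrays are placeholders standing in for the real annual rates, which I have not reproduced here.

```python
# Sketch only: the two series below are placeholders, not the actual annual rates.
import numpy as np

fast_start_rates = np.array([0.12, 0.14, 0.11, 0.13, 0.15, 0.12])  # hypothetical
standard_rates   = np.array([0.10, 0.08, 0.09, 0.10, 0.08, 0.09])  # hypothetical

r = np.corrcoef(fast_start_rates, standard_rates)[0, 1]
print(f"Correlation between annual success rates: {r:.2f}")
```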


There was a comment on my 2012 Marsden post that the >1000 proposals rejected annually represented a huge opportunity cost. However, the worth of a rejected proposal is not zero. I always tell myself that it is a chance to plan my research several years in advance, and – if you make it through to the second round – a chance to get feedback from international experts in the field. Nonetheless, the significant growth in rejected proposals over the last few years suggests that the opportunity cost of the Marsden Fund may be increasing.

*NB: The figures released by the Marsden Fund in 2013 did not include GST.
** My thanks go to Jason Gush for filling in some holes in my data on fast-starts.

5 Responses to “Marsden 2013: Big increase in funding lifts success rate”

  • I was wondering recently if it would be possible to get figures on publications from successful and unsuccessful Marsden proposals. Then one could investigate whether the Marsden process actually selects for “excellent fundamental research”. This would tell us whether the process is relatively arbitrary after the first round, in which case a lottery system would be a more efficient use of academic time.

  • Hi Paul – Adam Jaffe from Motu currently has a student doing something along the lines you suggest. It is a bit more complicated than it first seems because those who are successful receive additional resources, i.e. they might not be the most excellent, but they might nonetheless go on to do better simply because of the extra funding. Adam has a method for dealing with this that uses the panel rankings (put simply, you compare those who just missed out with those who just made it, on the assumption that they were very similar in quality; there is a rough sketch of this after the comments below). Hopefully we’ll hear back on the results from this study later this year.

  • Sounds like an exciting project. Do they have access to several years’ worth of panel rankings? How are they mapping proposals to publications? It sounds rather complicated but potentially very worthwhile.

  • Yes, we should learn a lot I think. I am really pleased that the Marsden Fund is leading the way in rigorous evaluation of its programme. Good on them.

    RSNZ have digitised panel rankings going back at least as far as 2003, and publications are obtained from Scopus searches for papers authored by investigators. Obviously it’s difficult to assign individual papers to particular proposals (especially when the proposal in question was never funded!), but by looking at the total output of each investigator, you ought to be able to see if there is a productivity bump from getting funded. You can then look for second-order effects in citations, collaborations, etc.
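
Here is a rough sketch of the comparison described in this thread, assuming a hypothetical table of investigators with panel rankings, a funded flag, and publication counts before and after the decision year. The data, column names and the ranking band around the cut-off are my own illustration, not the actual design of the Motu study.

```python
# Sketch of the near-the-cutoff comparison described in the comments above.
# The data, column names and cut-off band are illustrative assumptions.
import pandas as pd

df = pd.DataFrame({
    "panel_rank":  [1, 2, 3, 4, 5, 6, 7, 8],      # 1 = top-ranked proposal
    "funded":      [1, 1, 1, 0, 0, 0, 0, 0],       # funding cut-off after rank 3
    "pubs_before": [10, 8, 9, 9, 8, 10, 7, 6],     # publications in the prior 3 years
    "pubs_after":  [14, 11, 12, 10, 9, 11, 7, 6],  # publications in the following 3 years
})

# Restrict to proposals close to the cut-off, where funded and unfunded
# investigators should be of very similar quality.
near_cutoff = df[df["panel_rank"].between(2, 5)]

# Compare the change in output for funded vs unfunded investigators
# (a crude difference-in-differences for the "productivity bump").
bump = (near_cutoff["pubs_after"] - near_cutoff["pubs_before"]).groupby(
    near_cutoff["funded"]).mean()
print(bump)  # mean change in publication count: 0 = unfunded, 1 = funded
```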
