In a post last year, I looked at New Zealand’s bibliometric productivity in the university and government research sectors using data from the SCImago bibliometric site. Over the next few weeks, I will report on some further bibliometric analyses using the Thomson Reuters Web of Science. While it provides substantially the same New Zealand-wide results as SCImago, the Web of Science database also allows me to break down publication data by research institute (and by individual author if needed). Unfortunately, it is not freely accessible – I have institutional access through Victoria University of Wellington.
I’ll start this series of posts by looking at the total published outputs of the Crown Research Institutes (CRIs). The CRIs were established in 1992, drawing their scientists from the Department of Scientific and Industrial Research (DSIR), the research division of the then Ministry of Agriculture and Fisheries, and the New Zealand Forestry Service. Shortly thereafter, a significant portion of the Crown funding for science became contestable through the Public Good Science Fund, open to the CRIs, universities, and businesses or other organisations conducting research and development.
At the time, the restructuring of government science into the CRIs was highly controversial. ‘Is New Zealand shooting itself in the brain?’ asked New Scientist magazine. ‘A small country does something like this at its peril,’ said John Stocker, chief executive of Australia’s main research organisation, the CSIRO*. The DSIR had given the world Marlborough Sauvignon Blanc, earthquake-resistant lead-rubber bearings for building foundations, and high-temperature superconductors, yet the Government of the day thought that the new Institutes would be better placed to contribute to New Zealand’s economic growth.
Almost two decades later, our new Government is wondering how the experiment went. While the government scrutinises CRI balance sheets closely, other aspects of CRI performance receive very little attention. This is surprising, since the reason the Crown owns such research institutes has nothing to do with their balance sheets at all.
Here I will look at how the CRIs have performed bibliometrically, starting with their total published output from the year following their establishment. The figure on the left shows the number of papers in the Web of Science database published by scientists at the CRIs since 1993. The annual number of publications doubled from 600 in 1993 to 1200 in 1997, a level at which it has remained to the present. The increase in output from 1993 to 1997 was substantial, but how was it achieved?
The next figure shows the total researcher FTEs in the CRI sector from 1994 to 2006, and the corresponding productivity (in papers per FTE) over the same period. Researcher FTEs increased from 1996 to 2002, but then declined by 20% from their 2002 peak. Note that the productivity of researchers, in papers per FTE, remained relatively static over the period in question. This largely reflects the New Zealand situation as a whole, where productivity has remained steady, and changes in levels of published output have been driven by changes in FTEs.
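The productivity measure here is just total papers divided by researcher FTEs. As a rough illustration only – using the approximate figures quoted in this post, not the underlying Web of Science or Statistics NZ series – it can be computed like this:

```python
# Papers-per-FTE productivity, as used in the figures above.
# The example numbers are the approximate values quoted in the text
# (about 600 CRI papers in 1993; about 1300 researcher FTEs in 1994),
# paired here purely for illustration.
def papers_per_fte(papers: float, fte: float) -> float:
    """Published papers per researcher full-time equivalent."""
    return papers / fte

print(round(papers_per_fte(600, 1300), 2))   # roughly 0.46 papers per FTE
print(round(papers_per_fte(1200, 1800), 2))  # roughly 0.67 papers per FTE
```

The point of the metric is that when it stays flat, any rise or fall in total output must be coming from changes in researcher numbers rather than from individual researchers publishing more or less.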
(Update: I received some better FTE data from Statistics NZ, so the figure above was replaced on 18 March 2010. The data is similar to that shown in the original figure, but with the addition of the 2008 data we see there was a large jump in researcher FTEs from 2006 to 2008, reversing the decline since 2002.)
In my next post on the CRIs, I will look at how the number of citations of their papers has changed over time. I will then look at how the CRIs have been able to lift their researcher FTEs from 1300 in 1994 to over 1800 in 2006. After that I will move on to the universities.
* Australia still has the CSIRO, and although some reforms have taken place since John Stocker made his comments, Australia has resisted introducing the direct competition between CSIRO and university scientists that has characterised our science system.