CRI bibliometric performance: Part II

By Shaun Hendy 10/02/2010

In a post a few weeks ago, I looked at the total published output of the CRIs from 1993. Now I want to look at the citations to CRI papers, using two citation measures. The first is a two-year impact factor, a measure often used to rank journals: the 2008 impact factor of a CRI, for example, is the average number of citations received in 2008 by papers published by authors at that CRI in 2006 and 2007. The second is a five-year impact factor: the 2008 five-year impact factor is the average number of citations received in 2008 by papers published between 2003 and 2007.
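To make the two definitions concrete, here is a minimal sketch in Python of how such an impact factor could be computed. The function name, the toy data, and the data layout are all hypothetical illustrations, not how Thomson Reuters actually stores or computes these figures:

```python
def impact_factor(papers, citations, year, window):
    """Average citations received in `year` by papers published in the
    preceding `window` years. window=2 gives the two-year measure
    (e.g. citations in 2008 to papers from 2006-2007); window=5 gives
    the five-year measure (citations in 2008 to papers from 2003-2007)."""
    pub_years = range(year - window, year)  # for 2008, window=2: 2006, 2007
    pool = [p for p in papers if p["year"] in pub_years]
    if not pool:
        return 0.0
    # Total citations received in `year` by the pooled papers.
    cited = sum(citations.get((p["id"], year), 0) for p in pool)
    return cited / len(pool)

# Toy data: three papers, and citation counts keyed by (paper id, citing year).
papers = [
    {"id": "a", "year": 2006},
    {"id": "b", "year": 2007},
    {"id": "c", "year": 2004},
]
citations = {("a", 2008): 4, ("b", 2008): 2, ("c", 2008): 0}

two_year = impact_factor(papers, citations, 2008, 2)   # (4 + 2) / 2 = 3.0
five_year = impact_factor(papers, citations, 2008, 5)  # (4 + 2 + 0) / 3 = 2.0
```

Note that the older, uncited paper drags the five-year figure below the two-year one, which is why the two measures can tell different stories about the same institution.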

Now, the analysis I am going to give below is somewhat naive. I should really be breaking down the citations by subject area (as pointed out by Crikey Creek’s Daniel Collins in a comment last year). This is important because citation rates differ considerably between disciplines – unfortunately I haven’t had the time to do this, except in a few special cases such as my own Institute. Thus, differences in impact factor between Institutes will depend on the areas in which they work, and changes in those differences over time may reflect changes in focus within Institutes, rather than changes in the impact of the research conducted.

Why do citation rates differ between disciplines? At least part of the difference comes from the degree of empiricism within a discipline. Medical science frequently makes use of the aggregation of data from many studies, some of which may be too small to have statistical significance on their own. So if your small study suggests that smoking is a risk factor for diabetes, it will be important to cite as many other studies of smoking and diabetes as possible to give your reader context. Mathematics, on the other hand, relies on mathematical proof. To prove the Riemann hypothesis, you may only need to cite a handful of papers that contain results you rely on in your proof. You hardly need to cite every paper on the Riemann hypothesis that has appeared in print. Not surprisingly, journals in medical science typically have much higher impact factors than mathematics journals.

[Figure: CRI impact vs NZ]

On to the results. Firstly, I have plotted the CRI (two-year) impact factor from 1995 to 2008 (on the right) against the New Zealand impact factor as calculated from the Thomson Reuters database. We note that both data series show large increases over this time period. However, in 1995 the CRIs trail New Zealand as a whole, whereas in 2008 the CRIs lead New Zealand. The data are sufficiently noisy, though, that one can’t assert with much confidence that the CRIs differ significantly from the rest of the country.

[Figure: CRI 5-year impact]

With the five-year impact factor, however, the trend seems clearer: the five-year impact factor of the CRIs is below that of New Zealand as a whole at the end of the 1990s, but by the mid-2000s it surpasses that of the rest of the country. As I mentioned above, there could be a number of explanations for why CRI citations per paper have grown faster than those of New Zealand as a whole. For example, I wonder if this could reflect a diversification of research activities at Universities, where disciplines with lower impact factors have started publishing more, perhaps as a result of the Performance Based Research Fund.

Unfortunately, without breaking down citations by discipline we can’t really tell whether this does reflect an increase in relative impact by CRI researchers. However, the data does suggest that this would be a worthwhile exercise: why has CRI impact surpassed that of the rest of New Zealand in the last decade?

Responses to “CRI bibliometric performance: Part II”

  • The Australian Research Council have recently completed a large exercise in which they ranked every academic journal less than two years old into four categories (A*, A, B and C) for the purposes of performance-based research funding. Along the way they were forced to deal with exactly the problem you describe with different citation rates in different disciplines, and even between sub-disciplines. I believe this was addressed by moderating the results of the bibliometrics.

    In the case of mathematics this involved the Australian Maths Society and academics in departments around the country giving feedback on a proposed list of rankings. At one point it was even suggested that applied maths journals be ranked by impact factor and/or citation rates, with pure maths journals ranked by a peer-review process due to the (sometimes) very low citation rates.

    I’m not sure how the final rankings were determined, though; given the effect of the ranking on university (and individual*) funding, I’m sure there was some lobbying from certain areas to make sure that the journals they publish in most frequently were highly ranked.

    (*The institution I am at gives $1000 directly to researchers for each paper they publish in an A* ranked journal.)

    The full (~5MB) list of journal rankings is available at

  • Actually, I am aware that the ARC journal ranking system is already being used in New Zealand as a proxy for impact. I suspect we will become very familiar with this system over here in the seventh state.

  • And another interesting question is why the citation rates as a whole are increasing so dramatically. Is NZ typical in this respect?

  • Hi Mike, yes, this is a good question – probably worth a follow-up post sometime. Yes, world impact factors are increasing; without having looked at this closely, I think NZ is doing somewhat, but not dramatically, better than average.

  • Further to my earlier comment, aggregate citation rates must be a function of the total number of new papers being published, and the average number of other papers cited by each new paper. Presumably the former is rising strongly. The latter depends, as you have said, on the mix of disciplines. Maybe multidisciplinary papers tend to cite a lot of papers, too.
    Anyway, to keep pace, a person or organisation researching in a discipline would want their citation rates to at least match the growth in the number of papers being produced in their discipline.
    For what it’s worth, the CRIs vary a lot in their production of papers. The ratio of highest to lowest is more than 4 to 1.

  • Mike – yes, I agree. A very nice analysis has been done for the Universities by Warren Smart at the Ministry of Education (available at …). Warren broke citations down by subject area and found that citation rates in NZ Universities were growing faster than in the Group of Eight Australian Universities. As far as I am aware, nothing like this has been done for the CRIs.

    Another interesting observation I have made in the CRI data is that the outputs of some individual CRIs have varied by as much as a factor of two over the last decade.