Spin Doctors go to work on PBRF

By John Pickering 12/04/2013

University of Taihape: Doctor, Doctor, I’ve got a 1.7 on my PBRF

Doctor Spin: Never mind son, your Gumbootlogy results make you the healthiest tertiary education provider in the country.  Let’s talk about that, shall we?

Scoop.co.nz has all the spin from the universities and polytechnics this morning as they try to give the impression that each is the best. At times like this I am ashamed to be an academic. One of the worst sins is to cherry-pick data to make yourself look good. We are used to this from certain sectors of society, but we should expect better from our educational institutions. Unfortunately, the culture of self-promotion above all else has taken hold in our hallowed halls.

For those who are unaware of what I am talking about: around 18 months ago every academic in the country had to put forward a “portfolio” to demonstrate just how research-active they were. This is the Performance Based Research Fund (PBRF) exercise, held every six or so years. Panels of academics under government oversight then scored each portfolio, a process that has taken 15 months. The result is that every academic has been given a grade: A, B, C or not research active. The grades of the academics in each institution are then fed into four different formulae, each incorporating additional information about a different aspect of the institution (e.g. numbers of postgraduate students). Four numbers result, which gives Doctor Spin plenty to play with. These numbers are also used to allocate hundreds of millions of dollars of research funds; herein lies the importance of PBRF to the institutions. A number is also produced for each of the self-selected academic units that the institutions put forward to the Tertiary Education Commission. If an institution doesn’t score well on any of the four overall grades (relative to other institutions its own size), it can pick a favourable number from one of its academic units and talk about that. More grist for the spin mill.
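
The mechanics above (grades in, comparative numbers out) can be sketched as a toy calculation. This is a hedged sketch only: the weights and portfolio counts below are invented for illustration and are not the actual TEC formulae, which also fold in things like postgraduate numbers.

```python
# Toy PBRF-style average quality score (hypothetical weights only;
# NOT the actual TEC funding formulae).
GRADE_WEIGHTS = {"A": 5, "B": 3, "C": 1, "R": 0}  # "R" = not research active

def quality_score(grade_counts):
    """Weighted average grade per submitted portfolio."""
    total = sum(grade_counts.values())
    if total == 0:
        return 0.0
    weighted = sum(GRADE_WEIGHTS[g] * n for g, n in grade_counts.items())
    return weighted / total

# A small unit submitting only its strongest researchers out-scores a
# large unit that submits everyone: the cherry-picking lever.
small_selective = {"A": 6, "B": 2, "C": 0, "R": 0}     # 8 portfolios
large_inclusive = {"A": 30, "B": 60, "C": 40, "R": 7}  # 137 portfolios

print(quality_score(small_selective))            # 4.5
print(round(quality_score(large_inclusive), 2))  # 2.7
```

Averaging per submitted portfolio is exactly why a self-selected unit can post a flattering number: the denominator only counts who was put forward.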

Academics are notoriously competitive, which is obviously a good trait when it drives them to produce better and better research; I certainly have some of that streak in me. However, it is not helpful when it results in attempts to pull the wool over the eyes of the public, as happened yesterday. The PBRF is a complex system designed to allocate research funds and, hopefully, to improve research quality. Academics will argue until the cows come home over whether it does this fairly. It is certainly a very expensive exercise. It certainly focusses institutions on the importance of research, which is a good thing; remember, the teaching in our universities (not polytechnics) is required by law to derive from research. However, in a small country where the combined size of all our universities is only that of some of the larger overseas universities, I wonder whether such an inter-institution competitive model is best for the country. Perhaps the story should be an evaluation of the costs and benefits of the exercise: is this the best method of allocating funds? Such a story should also consider whether the competition is enhancing or detracting from the quality of research. After all, in almost any field the experts are spread across institutions. Collegiality is a major driver of good research; does PBRF hinder it?


If you want to check out the PBRF results in detail yourself, you can download a PDF from the Tertiary Education Commission here.

Disclaimer:  If you think my skepticism about PBRF is sour grapes because of a “poor” grade, then you’d be wrong.

Tagged: PBRF, Performance Based Research Fund, Scoop, TEC, TEO, Tertiary Education Commission, Tertiary Education Providers, University, University of Taihape

11 Responses to “Spin Doctors go to work on PBRF”

  • John,
    Also, any degrees taught at polytechnics should be taught by a majority of research-active staff.

  • Well said, John.
    The PBR Fund is $250M rising to $300M by 2016. The money isn’t what drives the universities, because the distribution of it doesn’t change very much after each round. This is because there are several variables in the calculation and they offset each other.
    It’s the reputational impact which is important, and this is where the spin doctors start to earn their money. The universities imagine that their PBRF rankings influence student enrolments. This may or may not be true. The rankings also confer bragging rights on the vice-chancellors, which is supremely important.
    As to the 2012 results, they reveal another feature of the PBRF: it is very vulnerable to gaming. Table A8 of the report shows that VUW has achieved the highest average quality score (by some margin) in ‘Biomedical’, despite this being the first time they have entered the subject area, and despite VUW not having a medical school. They submitted only 8 portfolios against Auckland’s 129 and Otago’s 137. This is another example of cherry-picking.
    Table A5 has Auckland as the top Ag Science department, which defies belief. They have 0.2 of an A grade researcher (probably seconded in from a CRI) against Massey’s 14.
    Averaging the quality scores has produced nonsensical results.

    In my view the PBRF is deeply flawed; it is expensive to run, encourages perverse behaviours, and leads to unreliable comparisons.

  • The PBRF is only really useful for allotting funding to the different tertiary organisations based on the outputs of their researchers.
    As both John and kemo sabe point out quite clearly, any attempt to use it to garner prestige is laughable in its absurdity, as anyone who even vaguely understands the system should realise.
    Anyone who “hides” C and even B graded researchers in order to garner greater prestige by having a higher proportion of A grade researchers is depriving their institution of the funding these researchers will bring in.
    Seems stupid to me.

    And I doubt students and their parents will be swayed by this information. In my experience students tend to pick institutions based on the programmes available, their location and sometimes on the party life 🙂
    And many postgraduate students target specific lecturers who share their interest.

    The only ones who care are the senior management, as they play these silly games over who is the “top”, “premier” or “best” institution (and perhaps a few gullible politicians).
    The same thing happens after each Marsden round, as different universities brag that they got the most funding, the most funding proportional to their size, or the most money in different categories. All seems a bit daft to me.

  • Table A-9 has the University of Auckland at the bottom of the table, which certainly does not reflect the capability and the talent at the university. They still have the greater number of A’s, and the large number of C’s is no doubt due to all the postdoctoral students who, by virtue of being at the beginning of their careers, are unlikely to score anything higher than a C.

    If PBRF data is used to rank universities then I think kemo sabe has it spot on in suggesting that
    “the PBRF is deeply flawed; it is expensive to run, encourages perverse behaviours, and leads to unreliable comparisons.”

  • TEC should have anticipated that TEPs would try to manipulate their scores. Blocking R and R(NE) results from the calculation (to try to avoid gaming) blunts the PBRF as a means to encourage Universities to support and improve research outputs of staff who teach degree courses, and means that Universities will be more likely to differentiate us into “research only” or “teaching only” staff. This in turn will undermine the credibility of degrees and blur the distinction between Universities and Polytechs.

  • Well, over a week later, in today’s paper we get “University claims topmost rank”, leading with

    “The University of Otago is still ”clearly New Zealand’s top-ranked university for research”, once all assessment measures used in the latest Performance-Based Research Fund evaluation are considered, Otago officials say.”

    They also seem to be in a bit of a stand-off with Victoria over the Law School results,

    “Otago Dean Mark Henaghan said Victoria University’s claim to be the top law faculty for research quality was not true and ”we are clearly well out in front”.”

  • Hi. I had to leave my school in 2011 (they closed it, or transferred it, whatever you want to call it, in order to inflate their EFTS at the main HQ) and they shut down all PBRF records to “improve them”, so I never got to add to mine or even see it. How do I know what grade I received? I left the country for another job, so it would have been, ah, academic, but interesting.