University Rankings by PBRF Score [1]

 

 

Keywords: Education; Statistics

 

The PBRF (Performance-Based Research Fund) scorecard has modified university behaviour substantially. This is not a paper about how that internal behaviour has changed; rather, it suggests that the scores may be used in different ways to draw quite different conclusions. Gilling’s law states that the way you score the game shapes the way the game is played. The PBRF scores seem to be shaping the way the universities are playing their games.

 

The PBRF originated in an unfortunate policy arising from a conflation of education and training, recommended by the 1988 Report on Post Compulsory Education and Training in New Zealand (the Hawke Report). That meant, in principle, that tertiary training institutions would be funded at the same level as universities. Inevitably, means were sought to get around the absurdity of equating education with training. One such means was to reward tertiary institutions (predominantly universities) financially for the research they did. The PBRF exercise was to measure the amount of that research through staff’s publication records.

 

It involves each university submitting a list of academics and their research records; the academics are then graded and scored. The scores are aggregated to give a measure of average ‘quality’, which is reported by subject group within each university and for each university as a whole.
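As a rough sketch of that arithmetic, the code below computes an average quality score and the aggregate (total) score used in the ‘firepower’ tables later in this paper. The grade weights shown are illustrative only; the actual weightings and grade categories are set by the Tertiary Education Commission.

    # Illustrative sketch only: the weights below are hypothetical,
    # not the actual PBRF weightings.
    GRADE_WEIGHTS = {"A": 10, "B": 6, "C": 2, "R": 0}

    def average_quality(grades):
        """Average quality score across the staff a university submits."""
        return sum(GRADE_WEIGHTS[g] for g in grades) / len(grades)

    def total_quality(grades):
        """Aggregate score (the 'firepower' measure used later): the sum, not the average."""
        return sum(GRADE_WEIGHTS[g] for g in grades)

    # Under these illustrative weights, a university submitting one A, two Bs
    # and a C would report an average quality of (10 + 6 + 6 + 2) / 4 = 6.0
    # but a total of 24.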

 

The Tertiary Education Commission (TEC) then reports the scores by ranking the universities by their average quality score. There is no need to do so; they could report them in alphabetical order, for instance. Or they could rank them by some of the other measures reported below.

 

One observes that while the financial rewarding continues, the PBRF exercise has taken on a life of its own by ranking the universities by their PBRF scores. The universities compete fiercely to have a high average quality score. There are numerous reports of the manipulation of the PBRF measurement system. This note is not about that, although an appendix describes some of the alleged manipulations; it assumes that the exercise is broadly honest (or at least that the universities are equally dishonest). Its point is to show that there are still various ways to rank the universities.

 

The TEC ranking is as follows:

 

Average Quality Research Score of Submitted Staff

Victoria University of Wellington      (VUW)            5.51

University of Auckland                      (AU)                5.12

University of Otago                            (OU)                4.96

University of Canterbury                    (CU)                4.79

University of Waikato                         (WU)               4.53

Massey University                              (MU)               4.31

Lincoln University                              (LU)                4.02

Auckland University of Technology  (AUT)             3.59

 

I’ve done a simple check on the ranking, comparing the number of staff submitted for the PBRF exercise with the entire staff. I could find data only for 2011 (a year earlier than the PBRF census). [2] Here are the resulting rankings. [3]

 

Proportion of All Staff (EFTS) submitted for PBRF Assessment

CU                  34.5%

AU                  32.5%

VUW              32.5%

OU                  31.1%

MU                  30.9%

WU                 28.6%

LU                   25.6%

AUT                21.9%

 

The differences in these proportions are rather surprising.

 

FIREPOWER

 

For some purposes what matters is economies of agglomeration: the total number of staff on a campus who are doing significant research. The total quality on campus may matter more than the average quality. Here are the rankings by aggregate PBRF score (here called ‘firepower’).

 

(Observe that the original purpose of the PBRF exercise – to reward universities for their research activities – was to calculate the firepower, not the so-called ‘research quality rankings’.)

 

Total Quality Score (Total PBRFs)

AU                  7964

OU                  5795

MU                  3959

VUW              3535

CU                  2956

WU                 1996

AUT                1542

LU                     700

 

There are a couple of problems with this measure. First, a university may be multi-campus, and it may be that the amount of firepower on each campus matters more than the total across all campuses. However, it is not easy to establish the number of campuses each university has. [4]

 

Moreover what may matter for economies of agglomeration is a cluster of subjects (such as agriculture, arts, medicine or science) rather than the firepower of the whole university.

 

In any case this measure favours large universities. What about student access to significant researchers?

 

FIREPOWER PER STUDENT

 

Do students benefit from the PBRF-rated staff? We cannot answer that (there are excellent teachers who are of great benefit to students even if their PBRF grading is negligible). But we can assume that they do, and calculate firepower per student (actually per 100 EFTS, based on 2011 numbers).
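As a minimal sketch (with hypothetical figures chosen only to show the calculation), the per-student measure is simply the total score divided by hundreds of EFTS; the per-$million measures in the later tables are computed the same way, with revenue in $millions in place of EFTS/100.

    def firepower_per_100_efts(total_quality_score, efts):
        """Total PBRF ('firepower') score per 100 equivalent full-time students."""
        return total_quality_score / (efts / 100)

    # Hypothetical example: a total score of 5,000 and 20,000 EFTS
    # gives 5000 / 200 = 25.0 per 100 EFTS.
    # The per-$million tables below replace (efts / 100) with revenue in $millions.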

 

Total Quality Score per 100 EFTS

OU                  32.8

AU                  27.6

LU                   25.2

CU                  23.6

MU                  23.6

WU                 23.0

VUW              22.7

AUT                10.2

 

A rather different ranking from the previous ones.

 

FIREPOWER PER DOLLAR

 

How much does the government get for its spending in each university?

 

PBRFs per $million Total Government Payments 2011

VUW              23.0

OU                  21.8

MU                  21.7

AU                  21.6

WU                 20.5

CU                  19.4

LU                   17.0

AUT                10.9

 

The rankings change again.

 

How much does the country as a whole get for the total revenue of each university, which includes student fees, research grants, business contributions, donations, interest income and the like?

 

PBRFs per $million Total Revenue 2011

VUW              10.5

CU                  10.2

OU                  9.8

MU                  9.1

WU                 8.9

AU                  8.5

LU                   6.1

AUT                5.1

 

This table might be thought of as measuring the gross productivity of each university, except that it ignores teaching as an output. To get a rough idea of net productivity we deduct the average cost of an EFTS in the Institutes of Technology and Polytechnics (ITP) sector ($14,800 per EFTS) from each university’s government and fee revenue, and express its total PBRF score per $million of the remainder.

 

In effect we are assuming that, were the universities without other (research) responsibilities, their production costs would average the same as the ITPs’. However, they receive more cash because of those other responsibilities, thereby producing non-teaching outputs, which are proxied by the PBRFs.

 

This gives a kind of net productivity (above teaching) measure and ranking.
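A minimal sketch of that calculation follows; the revenue and EFTS figures in the example are hypothetical, chosen only to illustrate the arithmetic, and the 10 percent course-cost adjustment discussed after the table is simply a larger deduction.

    def pbrfs_per_million_non_teaching(total_quality_score, govt_and_fee_revenue,
                                       efts, itp_cost_per_efts=14800):
        """Total PBRF score per $million of revenue left after deducting
        an ITP-level teaching cost for every EFTS."""
        non_teaching_revenue = govt_and_fee_revenue - itp_cost_per_efts * efts
        return total_quality_score / (non_teaching_revenue / 1_000_000)

    # Hypothetical example: a total score of 4,000, $500m of government and
    # fee revenue and 20,000 EFTS leaves $500m - $296m = $204m of
    # non-teaching revenue, i.e. about 19.6 PBRFs per $million.
    # The 10 percent adjustment discussed below uses
    # itp_cost_per_efts = 14800 * 1.10.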

 

PBRFs per $million Non-Teaching Revenue 2011

AUT                58.6     (1)

VUW              40.6     (2)

CU                  32.7     (3)

WU                 27.9     (4)

MU                  27.0     (5)

OU                  21.3     (6)

AU                  17.4     (7)

LU                   12.1     (8)

 

Arguably university courses are more expensive than ITP ones (e.g. more postgraduate courses and also exceptionally expensive courses such as medicine). The bracketed rankings in the table above show the order if each university’s courses are 10 percent more expensive than ITP ones; the ranking proves to be the same.

 

Conclusion

 

No, I am not going to rant against the PBRF, which seems to have been designed by those with no understanding of economists’ thinking about incentives in tertiary education. There is at least one 50-year-old paper about this, but it would be too much to expect the designers to be only half a century behind the frontier. (No doubt they were given As for their PBRFs.)

 

Nor am I going to say any more than is in the appendix about the manipulations that universities seem to have got up to in order to enhance their scores.

 

I am certainly not going to say which of the above measures is best. The point of this paper is that you can choose different but reasonable criteria and get quite different rankings. Anybody who clings to a particular (and no doubt favourable) ranking over any other is – well – deserving of a high, if dishonestly earned, PBRF score.

 

Endnotes

[1] Economic and Social Trust On New Zealand, www.eastonbh.ac.nz. In research it is usual to declare potential conflicts of interest. The author holds honorary positions in six of the universities and degrees from a seventh. I mention that I was also on one of the minor PBRF assessment panels in 2012. I am grateful to a number of academics who commented on an earlier version.

[2] The main source of data is the Tertiary Education Performance Report 2011.

[3] Note that staff totals do not include outsourced jobs; practices may differ from university to university.

[4] Two campuses may be close to one another in the same city, or they may be sited in different cities.

[5] It has not been possible to allow for differences in course mix by university.

 

Appendix

 

The following ‘manipulations’ of the PBRF assessment system have been mentioned. The examples come from different universities.

– refusing to accept a resignation from a high research performing academic until after the census date, instead classifying them as on unpaid leave;

– accelerating the appointment of a high research performing academic, so they squeak in before the census date;

– laying off low research performing staff even though they are excellent teachers;

– temporarily laying off low research performing staff so they are not on the books at the census date but performing teaching duties before and after it;

– coaching staff to increase their apparent research performance;

– omitting very low scoring staff to boost the average of the remainder. [A1]

 

One might argue that in some cases parallel behaviour by a student would be judged ‘cheating’.

 

[A1] There are a number of discussions on the net about the ‘rorting’, including:

http://www.kiwiblog.co.nz/tag/pbrf

http://offsettingbehaviour.blogspot.co.nz/2012/04/pbrf-games.html?utm_source=twitterfeed&utm_medium=twitter

 

PS: The 2013 World University Rankings from Quacquarelli Symonds gave the following ranking of New Zealand universities (each university’s world ranking is in brackets).

AU (94)

OU (155)

CU (238)

VUW (265=)

MU (343=)

WU (401-410)

AUT (471-480)

LU (481-490)

The survey ranks over 800 institutions (and looks at over 2,000), so New Zealand universities generally appear to be in the top half.