iitypeii

iitypeii is the work of two current-affairs nitpickers ([C] and [M]) and guests ([G]) giving their perspective and insight on publicly available statistics, the connections (and lack thereof) between them, and the erroneous conclusions drawn from them...

(In-)Complete University Guide... [C+M]

The 2014 sci-fi movie Interstellar has the following fantastic exchange between the lead character (Cooper) and the principal at his son's school:

Cooper: What's your waistline? 32? With, what, a 33 inseam?

Principal: I'm not sure I see what you're getting at.

Cooper: You're telling me it takes two numbers to measure your own ass but only one to measure my son's future?

Now, measuring the quality of a university is arguably even more complex, but the dialogue in Interstellar perfectly encapsulates the futile endeavour a number of newspapers and other institutions pursue on an annual basis - finding a small number of objective metrics which can be combined in order to "rank" universities. 

In our initial University Rankings blog post we pointed out that finding good metrics to measure (some notion of) "quality" in universities is important. What these should be is debatable, but as a university is by definition

a high-level educational institution in which students study for degrees and academic research is done

we would hope that at least one metric measures academic research...

So, how on Earth do we measure research quality in universities?

The Guardian University Guide (GUG) has a very elegant solution - just don't bother.

The Complete University Guide (CUG) incorporates some research metrics based on sensible publicly available data (!) - unfortunately they don't manage to apply their own methodology properly! As in our earlier post, we again only consider the "Mathematics" rankings - but the same analysis can be applied to all subjects...

So, what is this sensible data? The UK government conducts a periodic assessment of the research quality of UK universities - the most recent being REF 2014, in which research is graded into five ranks (4* down to 0*). The CUG 2016 research ranking claims to use this data directly, computing the average rank. Similarly, the CUG 2015 research ranking was (claimed to be) compiled using the most recent UK research assessment then available (RAE 2008), although the categorisation of Mathematics varies slightly between the UK research assessments.
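To make the arithmetic concrete, here is a minimal Python sketch of computing such an average score from a REF-style quality profile. The profile percentages are hypothetical, and we are assuming the average is a simple weighted mean of the star levels - the CUG only tells us it computes "the average rank".

```python
# Hypothetical REF-style quality profile: percentage of a department's
# research judged at each star level (4* down to 0*/unclassified).
profile = {4: 25.0, 3: 45.0, 2: 25.0, 1: 5.0, 0: 0.0}

def average_score(profile):
    """Weighted mean of the star levels (assumes percentages sum to 100)."""
    return sum(stars * pct for stars, pct in profile.items()) / 100.0

print(average_score(profile))  # 2.9 out of a maximum of 4.0
```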

The problem is that, looking through the CUG table, we were surprised to find (among others) Nottingham Trent University scoring 2.6/4 for research - but this department doesn't even exist! In fact, Nottingham Trent closed its maths department and then demolished the building around 10 years ago.

So how did Nottingham Trent get a score of 2.6/4 for Mathematics research? Well, the CUG methodology is actually to update a university's research score only if it appears in the latest UK government research exercise - otherwise the old score is simply carried forward. So Nottingham Trent's score is actually its score from 2001! 15 years out of date!
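A minimal sketch of the carry-forward behaviour that appears to explain this (the function names and all scores other than Nottingham Trent's 2.6 are our own illustrative inventions): a university missing from REF 2014 silently keeps its score from an older exercise instead of scoring 0.

```python
# Illustrative scores by exercise; only Nottingham Trent's 2.6 is real.
scores = {
    "REF 2014": {"Other Uni": 2.8},  # Nottingham Trent did not submit
    "RAE 2001": {"Nottingham Trent": 2.6, "Other Uni": 3.0},
}

def cug_style_score(university):
    """What the CUG appears to do: fall back to the most recent
    exercise in which the university appears, however old it is."""
    for exercise in ("REF 2014", "RAE 2001"):
        if university in scores[exercise]:
            return scores[exercise][university]
    return 0.0

def stated_methodology_score(university):
    """What the stated methodology implies: use the latest exercise
    only, scoring 0 for departments that did not submit."""
    return scores["REF 2014"].get(university, 0.0)

print(cug_style_score("Nottingham Trent"))           # 2.6 (from 2001!)
print(stated_methodology_score("Nottingham Trent"))  # 0.0
```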

Nottingham Trent should have a score of 0/4 for research if the CUG applied their own methodology properly. So we wanted to correct the CUG ranking and see whether it differed substantially from the published version. The following table gives our results for the CUG2016 research rankings: the first two columns are the actual CUG2016 research score and ranking, the next two columns are the corrected scores and rankings, and the final column indicates how many places up or down the ranking each university should move. Universities in red should not even appear in the research rankings; universities in green should (but don't).
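For anyone wanting to reproduce the comparison, here is a sketch of the rank-shift computation with made-up scores. Note that a corrected score of 0 should really remove a university from the research ranking altogether (the red rows); the sketch simply places it last.

```python
# Made-up (published, corrected) research scores per university.
scores = {
    "Uni A": (3.0, 3.0),
    "Uni B": (2.6, 0.0),  # e.g. a department that no longer exists
    "Uni C": (2.4, 2.4),
}

def ranking(index):
    """Rank universities by the chosen score column, highest first."""
    order = sorted(scores, key=lambda u: scores[u][index], reverse=True)
    return {u: pos for pos, u in enumerate(order, start=1)}

published, corrected = ranking(0), ranking(1)
for u in scores:
    shift = published[u] - corrected[u]  # positive = should move up
    print(f"{u}: published {published[u]}, corrected {corrected[u]}, "
          f"should move {shift:+d} places")
```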

Most of the "leading" universities are ranked properly, but a few are comical. In particular, Essex appears 21 places higher than it should!

For completeness we did the same study for the CUG2015 rankings. The UK government exercise used for CUG2015 (RAE 2008) was slightly different - mathematics was broken into three categories (pure, applied and statistics). CUG2015 used the same methodology as before, but averaged across only those categories in which a university submitted any research. Clearly this is highly misleading - Portsmouth, for instance, only submitted applied maths research and so was ranked 6th overall. So, in the following table we conduct the same study as before: the first two columns are the published score and rank, the next two are the same but computed properly, and the next gives the change in ranking of each university. In the final columns we give an alternative rank in which we properly weight the disciplines of each university (giving 0 if nothing is submitted), and again the change in ranking compared to the published CUG2015.
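To illustrate the difference between the two averaging rules, here is a sketch with made-up category scores. "Properly weighting" is ambiguous; one plausible reading (assumed below) is that every category counts, with non-submitted categories scoring 0.

```python
# Made-up RAE 2008-style scores per category; None = nothing submitted.
portsmouth_like = {"pure": None, "applied": 2.8, "statistics": None}

def cug2015_average(cats):
    """CUG 2015 rule: average only over categories with a submission."""
    submitted = [s for s in cats.values() if s is not None]
    return sum(submitted) / len(submitted)

def zero_filled_average(cats):
    """One reading of the alternative rule: non-submitted categories
    score 0, and all three categories are averaged."""
    return sum(s or 0.0 for s in cats.values()) / len(cats)

print(cug2015_average(portsmouth_like))      # 2.8  - looks like a strong dept
print(zero_filled_average(portsmouth_like))  # ~0.93 - reflects the narrow base
```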

So does anything change? Yes - a lot! Heriot-Watt appears 6 places lower than it should (18th vs. 12th), and under the alternative ranking Heriot-Watt should appear 9th. Comically (and most extreme of all), Portsmouth drops from 6th to 37th place! In total, there are 570 ranking changes between the alternative ranking system and the published one!

So, in conclusion, one can argue over whether the publication of research rankings is useful or not. However, if they are to be used then please can we compile them properly, as some of us ([C] for instance) made ill-informed decisions based on them. But of course, if you are the vice-chancellor of an under-performing university you may instead want to implement the following optimal strategy:

  1. Hire some top researchers doing top research!
  2. Submit that top research to the UK research exercise and get top results.
  3. Sack everyone and pull down the building!
  4. Never submit again.
  5. PROFIT

"If it cannot be expressed in figures, it is not science, it is opinion."
Robert Anson Heinlein