Data, Decisions, and Disruptions: Inside the World of University Rankings (S3E20)

Sam Pufek: You're listening to the World of Higher Education podcast, season 3, episode 20.
Alex Usher: Hi everyone, I'm Alex Usher and this is the World of Higher Education podcast.
University rankings are pretty much everywhere. The earliest university rankings in the U.S. date back to the early 1900s, and the modern ones to the 1983 debut of the U.S. News and World Report rankings. But the kind of rankings we tend to talk about now, international or global rankings, really only date back to 2003 with the creation of the Shanghai Academic Ranking of World Universities.
Over the decade that followed that first publication, a triumvirate emerged at the top of the rankings pyramid: the Shanghai Rankings, run by a group of academics at Shanghai Jiao Tong University; the Quacquarelli Symonds, or QS, Rankings; and the Times Higher Education World University Rankings. Between them, these three rankings producers, particularly QS and Times Higher, created a bewildering array of new rankings, dividing the world up by geography and field of study, mainly based on metrics relating to research.
Joining me today is the former Chief Data Officer of the Times Higher Education Rankings, Duncan Ross. He took over those rankings at a time when it seemed like the higher education world might be running out of things to rank. Under his leadership, though, the THE Impact Rankings, which are based around the 17 UN Sustainable Development Goals, were developed. And that's created a genuinely new hierarchy in world higher education, at least among those institutions that choose to submit to the rankings.
My discussion with Duncan today covers a wide range of topics related to his time at THE. But the most enjoyable bit by far, for me anyway, was the bit about the genesis of the Impact Rankings. Listen closely, especially when Duncan talks about how the Impact Rankings came about because THE realized that its industry rankings weren't very reliable. Fun fact: around that time I got into a very public debate with Phil Baty, the editor of the Times Higher, on exactly that subject. Which means maybe, just maybe, I'm kind of a godparent to the Impact Rankings. But that's just me. You may well find other points of interest in this very compelling interview. Let's hand things over to Duncan.
So Duncan, let's start at the beginning. I'm curious, what got you into university rankings in the first place? How did you end up at Times Higher in 2015?
Duncan Ross: I think it was almost by chance. I had been working in the tech sector for a large data warehousing company, and as a result I was working across many industries, almost every industry apart from higher education. I knew I was looking for a new challenge, something that was very different. And a friend approached me and said that she was aware of this role that might be of interest to me. So I started talking to Times Higher Education, and it turned out it really was of interest to me.
Alex Usher: So when you arrived at the Times in 2015, the company already had a pretty full set of rankings products, right? It had the global rankings, I think the regional rankings came in around 2010, and then the subject rankings, or field of study rankings I think they call them, a couple of years later. When you looked at all of that, what did you think? What did you think needed to be improved?
Duncan Ross: Well, the first thing I had to do was actually bring all of that production in house, because although they had rankings, the rankings were produced for them by Clarivate, well, Thomson Reuters as it was then, who were doing a perfectly good job. But as you may be aware, if you're not in control of these things, there's a limit to what you can do with the data.
And the other thing to say about this is that although on the face of it, it looked as if they had many rankings, in practice they had one ranking, the World University Rankings, and all the other rankings were really just cuts of that same data. And when you looked at the World University Rankings, there were only 400 universities, and it was dominated by Europe and North America. Almost 26, 27 percent of those 400 universities were from the USA. It really didn't reflect the global world of higher education. So the challenge was, well, what could we do to explore a broader world of higher education that actually reached out beyond the usual suspects? And also, were there other things we could explore and measure rather than just those research-centered metrics? There are good reasons why any international ranking agency is going to start with research, that's the data that's most consistent, but it's certainly not the only way you can think about excellence in higher education, as I'm sure you're very well aware.
Alex Usher: Oh, yeah. So what did you do to deal with that geographic diversity? Was it just as simple as saying, we're not going to look at 400, we're going to look at, I think they're up over a thousand now, aren't they? I've forgotten what the number...
Duncan Ross: 2,100 and something odd. And in practice, they're even larger than that, because we introduced, about two years ago, this concept of reporter institutions, so institutions who haven't yet met the criteria for being in the world ranking, but who are providing data. Now, the world ranking is artificially limited because we have to put, or rather they have to put, a threshold on participation based on the number of research articles that are published.
Now, that threshold is 1,000 papers over a five-year period. If you asked how many universities could potentially meet that criterion, it's probably about 3,000, and it keeps growing. But that is still only a fraction of the number of universities in the world. There are probably about 30,000, maybe even 40,000 higher education institutions, even before we start looking at things like community colleges.
So really expanding it was firstly about, well, let's not have an artificial boundary, let's reach out, let's do work in those other parts of the world that we're not currently working in. Let's try and think about things in a less Anglo-centric way. One of the challenges I constantly have, and this is something that people inevitably fall into, is that you tend to think about higher education from your own background and your own perspective. And that isn't necessarily the way that higher education works around the world. It's so easy to get trapped into thinking everything should look like, you know, the universities of Canada or the universities of the U.S. or the universities of the UK. And that's simply not the case. So we had to reach out. We had to be open. We had to try and think carefully about some of the challenges around collecting data on that global scale as well. And so now, I think Times Higher Education probably has data on about 5,000 to 6,000 universities. So a big step up from the 400. Still only a fraction of those around the world.
Alex Usher: Interesting. Well, it's interesting because that's exactly the mission of this podcast, to get people outside that Anglo-centric kind of view of the world. So I take your point that in the first couple of years you were there, most of what you're doing is with just one set of data, but you're cutting it in different kinds of ways.
But you know, it's not simple to collect data for rankings, right? It's tricky, and you have to make a lot of decisions, which are mostly around inclusion, right? How do you include things? How do you weight things? And you had a couple of big ones, I think, that you had to deal with, one in your first few years and one in the last couple of years. One was about fractional counting of articles. I remember that one went on for quite a while. We had this big surge of CERN-related articles out of, I guess, Switzerland, but with thousands of authors from around the world, and that changed the weighting. And so there was a move to fractional weighting, which in theory equalized things a bit, but not everybody agreed. And more recently you've had an issue about voting, right? There was what I think was called a cartel of voters in the Middle East with respect to the reputation rankings that you do. Can you talk a little bit about how you handle those kinds of things?
Duncan Ross: Well, I think the starting point is to say that you're really trying to evaluate things in a fair and consistent way, and inevitably you're then dealing with a very noisy and messy world. And I think there are two very different cases you raised there. The first is, how do you cope with the norms of the higher education sector? And publishing is one of the big areas there. A lot of people, especially academics who come from within a single discipline, get into that mindset of publishing works the way it works in my discipline, and therefore you can build a set of rules and apply them universally and they'll be perfect for everyone. And of course, that's simply not the way it works. The concept of first author doesn't exist in all disciplines. The idea of having, you know, your PI at the end of the list of authors, again, doesn't happen in all disciplines. And one of the things we found was that there was a group of subjects where, really, they were dealing with big science. And in big science, you have a really fundamental challenge: how do you recognize the many hundreds or thousands of people who participated in a project, but who aren't going to be amongst the three or four people who actually write up the work at the end of the day? And if we go back to the '20s, high-energy physics made a decision. They said, when we do experiments, everyone who participates above a certain level, and we will have a committee, because it's academia, we're bound to have a committee to decide who has reached that threshold, but everyone who reaches that threshold gets listed on the papers in alphabetical order.
But of course, that distorts things if you have 5,000 authors on a paper. So we had to come up with a mechanism for dealing with that. And in an ideal world, what you'd want to have is a metric which is the same for everything. If we think about the way that physics works, for example, we don't want one model of gravity that we use in some circumstances and another model of gravity that we use in different circumstances. That's not a very satisfactory way of doing it. You want to have one approach. Unfortunately, sometimes you have to make exceptions. Now, what we're doing, or what they're doing now, is moving towards some more sophisticated bibliometric measures, which hopefully deal with that kind of problem in a different way.
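For the curious, the fractional counting idea Duncan and Alex are discussing can be sketched in a few lines of Python. This is a minimal illustration of the general bibliometric technique as commonly defined, not THE's exact production method; the institution names are invented. Under full counting, every institution named on a paper gets full credit, while under fractional counting each paper contributes a single unit of credit split equally among its authors, so a CERN-style paper with thousands of authors barely moves any one institution's total.

```python
from collections import defaultdict

def count_credit(papers, fractional=True):
    """papers: a list of papers, each a list of author-institution names."""
    credit = defaultdict(float)
    for authors in papers:
        if fractional:
            share = 1.0 / len(authors)   # one unit of credit per paper,
            for inst in authors:         # split equally among all authors
                credit[inst] += share
        else:
            for inst in set(authors):    # full counting: every institution
                credit[inst] += 1.0      # named gets full credit
    return dict(credit)

# A two-author paper versus a CERN-style paper with 3,000 authors:
papers = [["Uni A", "Uni B"], ["Uni A"] + ["Elsewhere"] * 2999]
print(count_credit(papers, fractional=True))
# {'Uni A': 0.5003..., 'Uni B': 0.5, 'Elsewhere': 0.9996...}
print(count_credit(papers, fractional=False))
# {'Uni A': 2.0, 'Uni B': 1.0, 'Elsewhere': 1.0}
```

The design point is the one Duncan raises: fractional counting stops hyper-authored papers from swamping the metric, but it also changes the norms for disciplines where long author lists are standard practice, which is why not everybody agreed.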
And the second case you mentioned is fundamentally different, and that's where there is evidence of inappropriate behavior. This is not always at an institutional level; it can be at the level of an individual academic. We are seeing that in the world of publishing at the moment, where there are academics out there publishing more than 200 articles a year, which, kudos to them for their productivity, but there's a question of whether that is plausible. And there you're taking a very different approach. There it's about putting things in place to identify and penalize misbehavior. But it's tough, because at the same time you don't want to become judge and jury. It's very hard, particularly when we see things at a statistical level; we can see and we can infer that things are going on, but we don't, as it were, have a smoking gun. So that's some of the thought process behind that, but the goal is to be as fair and equitable as possible.
Alex Usher: We're going to take a short break. We'll be right back.
Sam Pufek: This podcast is brought to you by Higher Education Strategy Associates. In response to the widespread retrenchment across Canadian higher education, HESA has launched the Recovery Project. The financial challenges facing Canadian higher education are unprecedented, but they are not insurmountable. The HESA Recovery Project helps Canadian colleges, polytechnics, and universities navigate financial challenges by providing insights and facilitating peer learning and collaborative action.
Through monthly reports and virtual meetings, leaders gain evidence-based strategies on budget decisions, maintaining morale, and academic redesign. Reports and discussions begin this month, with future topics shaped by member needs to ensure timely, relevant support for institutions adapting to financial pressures.
For more information, visit higheredstrategy.com or see the episode description for links.
Alex Usher: And we're back. You know, Duncan, you hinted at this in one of your earlier answers, but I want to turn now to the Impact Rankings. I mean, this is the big thing that you introduced at the Times Higher. Tell us about the genesis of those rankings. Where'd the idea come from? Why impact, and why these SDGs?
Duncan Ross: So, it never started out with the intention of focusing on sustainability. It was actually a challenge put to me by Phil Baty, my good colleague. He'd always worried that there wasn't enough measurement around the technology transfer element of the World University Rankings. So we went out and collected a set of data from universities about technology transfer. We looked at income from consultancy. We looked at university spinoffs. And the data that came back was absolutely terrible. It was just all over the place and fundamentally not usable. So I went back to the drawing board, and when I did that, I came across SDG number nine, industry, innovation, and infrastructure. And I looked at that and thought, actually, this is kind of interesting, because it's an external approach. One of the challenges, by the way, of doing rankings is that people can always challenge your model. You know, is this a good model for excellence? The advantage of an external model is that, sure, you can challenge me on it, but I'm just going to point you at the United Nations and say, go argue with the United Nations.
So I found SDG number nine, and having done a bit of data science and come across the tank problem, I assumed that there were probably in the region of 13 to 18 SDGs out there. Sorry, that's a data science joke. They don't work very well in 99 percent of all circumstances. Anyway, I assumed there were more SDGs out there, and it turned out there were. And exploring the SDGs, it was like one of those light bulb moments. It became clear that this would be a really useful framework to understand probably the most positive role that universities could play in the world as it is at the moment. We all know that we're facing, well, those of us outside the U.S. certainly know that we're facing a climate catastrophe. Higher education has a really significant role to play in that. How can we support that? How can we measure that? How can we encourage better behavior from this really important sector?
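For readers who want the joke unpacked: Duncan is referring to the classic German tank problem, estimating the size of a numbered population from the serial numbers you happen to observe. A minimal sketch of the standard minimum-variance unbiased estimator (an aside for illustration, not anything THE used) shows why "observing" only SDG 9 points at an estimate of 17, which happens to be the actual number of SDGs.

```python
# German tank problem: given k observed serial numbers with maximum m,
# the minimum-variance unbiased estimate of the population maximum N
# is m * (1 + 1/k) - 1.

def tank_estimate(m: int, k: int) -> float:
    """Estimate the largest serial number from k samples with maximum m."""
    return m * (1 + 1 / k) - 1

# Having "observed" a single SDG, number nine (k=1, m=9):
print(tank_estimate(m=9, k=1))  # 17.0, the actual number of SDGs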
Alex Usher: The Impact Rankings are very different in the sense that roughly half the indicators, and there are like 240, 250 of them across all 17 SDGs...
Duncan Ross: I should say universities don't have to supply data for all of them...
Alex Usher: No, I know, I know, and I'll come to that in a second. But half the indicators are not naturally quantifiable, right? They're actually stories. So I write up: this is what we do in area X. It could be how we combat organized crime, or how we make sure that we're sourcing organic food for the institution, those kinds of things. And they're scored based on institutional submissions. So somebody, I don't know how the Times does it, but there's got to be some way that you're scoring these. How do you ensure that those kinds of institutional answers, and you're talking about maybe 120, 130 answers per institution maximum, and you've got hundreds of institutions doing this, how do you score them fairly and consistently?
Duncan Ross: Well, I can tell you that we have over 2,500 institutions supplying...
Alex Usher: Good lord, that many now.
Duncan Ross: ...data this year. So it's really expanded hugely. One thing to say, though, is that it's not strictly the case that these are written up. It's not like the Teaching Excellence Framework in the UK, where you can hand in an essay explaining why you didn't score as well as you thought you were going to, you know, the famous "the dog ate my student statistics" paper. Instead, what we're asking for is evidence of the work that you have done. Sometimes this will be policies. Sometimes this will be procedures. Sometimes this will be examples of the work that you've done. And the scoring approach is relatively straightforward. First: do you say you do something? We'll give you some credit if you say you're doing something. Then we look at the evidence that's been supplied and make a judgment about whether or not it is what they say it is. That's the second part of the scoring. And the third part, which I think is really important, is that you get extra credit where that evidence is a public document. Sure, you can write up stuff or point to evidence that may not be the case, but if you do so publicly, you're opening yourself to challenge. My favorite example would be around things like SDG 5, gender equality, and gender pay equity. If you have a policy on gender pay equity, do you publish it? If you publish it and you're not living up to it, I guarantee you, or at least I hope, that women in your institution will challenge you on that basis. So that's part of the balancing aspect.
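As a reader's aid, that three-part scoring logic can be sketched as a tiny function. The weights below are entirely hypothetical, invented for illustration (THE does not describe its per-indicator weighting here): some credit for claiming a policy, more when the supplied evidence checks out, and a bonus when that evidence is public and therefore open to challenge.

```python
def score_indicator(claimed: bool, evidence_ok: bool, public: bool) -> float:
    """Hedged sketch of the three-part scoring; all weights hypothetical."""
    score = 0.0
    if claimed:
        score += 0.25          # hypothetical: credit for the claim itself
        if evidence_ok:
            score += 0.50      # hypothetical: evidence judged genuine
            if public:
                score += 0.25  # hypothetical: public, hence challengeable
    return score

print(score_indicator(True, True, True))    # 1.0
print(score_indicator(True, True, False))   # 0.75
print(score_indicator(True, False, False))  # 0.25
```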
Now, of course, how do we evaluate that? Until this year, that was done by a team of assessors. Essentially, we brought in people, we trained them up, we supported them with some of our regular staff, we had a layer of checking, and we would check against previous answers and so on. So essentially we had individuals making those decisions. This year we're introducing, and you won't be surprised to hear this, AI as an approach to help us go through this. That allows us to filter out a lot of the easier decisions, so that people are left with the more challenging ones. And it's also a way of ensuring that you don't suffer from assessors getting tired. When you've seen 15 different answers to the same question from different universities, it can get a little tedious after a while.
Alex Usher: Yeah, it's like that experiment with Israeli judges, right? You don't want to be the last one before lunch; you get a much harsher sentence, or score I guess, if the judge is judging on an empty stomach. You must have that kind of issue to deal with. I've been impressed by how enthusiastically the Impact Rankings have been embraced. Certainly Canadian institutions have taken to them. I think we were four of the top 10 last year and three of the top 10 this year, which we don't get that often. But they haven't seen a lot of take-up yet, though maybe that's changing this year, in either China or the United States, which of course are probably the two biggest national players in the world of research-based university rankings. Why do you think there's been such a different reception in different parts of the world? And what does that say about the way different parts of the world see the purpose of universities?
Duncan Ross: I think that there's definitely a case that different geographies, different nations, have different approaches to the SDGs. China, as you might imagine, is very much based around the degree to which it fits in with current Communist Party thinking. And you could argue the same for the US: the incoming administration is making it fairly clear that SDG 10, reduced inequalities, and SDG 5, around gender, are not going to be top of their priorities, and probably SDG 1, no poverty, as well. So some of it reflects the political attitudes of the government. But sometimes it also reflects the economic structure of the higher education system itself. If you look at the U.S., with its exceptionally high fee system, clearly universities need to raise money. What they're really interested in is the dominant ranking there, which is US News and World Report. That is the 600-pound gorilla there. And if I was in their position, I'd do the same. That's the one that's going to get me the applications.
But in other parts of the world, and this harks back to our earlier comments about different priorities in different parts of the world, they view rankings quite differently. If you're a university, say ITS in Indonesia, how can you get visibility? How can you demonstrate that you're different from some of the other universities? There are 4,000 universities in Indonesia alone; it's a huge country with a huge higher education sector. Well, the Impact Rankings gave them an opportunity to demonstrate the great work that they were doing in a very different way. And I think one of the things I'm proudest about in the Impact Rankings is that, unlike the world rankings or, as you said earlier, the teaching rankings, it's not all of the usual suspects at the top.
I love, for example, the University of Western Sydney. It's a fantastic institution. If you're ever in Sydney, take the train out, keep on the train, stay on the train, it's a long way out of the center. Go and visit them and have a look at the work they're doing, not just environmentally, but also with the Aboriginal and Torres Strait Islander peoples on whose lands they are based. They're doing some amazing work. And I'm so pleased that we've been able to raise the profile and visibility of some of these institutions who wouldn't otherwise necessarily have got the recognition they deserve.
Alex Usher: But you're still left with the problem that an awful lot of the institutions that do really well in the research rankings have in effect boycotted these rankings, right? Because they're not guaranteed to come first. It's, "I'm only going to participate in rankings where I know I'm going to come first."
And I know at the beginning, at least, you had that issue with LERU, the League of European Research Universities. And still, I guess, with the United States, the numbers are down. Do you think eventually the Times will crack that? I mean, it's a really hard nut to crack. The OECD fell down on AHELO for more or less the same reasons, right? It was the same people who were saying rankings are terrible, and we don't want better ones. So, what do you think about that?
Duncan Ross: So I've got to share a brief anecdote about the whole rankings boycott approach. I'm not going to name the university concerned, but a particular university was very public in saying, we're withdrawing from the Times Higher Education World University Rankings, which, by the way, is something you can do because it's voluntary; not all rankings are voluntary. But they withdrew very publicly. And about a month later, we got an email from their graduate studies department saying, please, can we have a copy of your rankings? Because we use it to evaluate who we're going to interview.
So there is this kind of odd mindset going on. But I think from the perspective of the Impact Rankings, I'm pretty casual about it. I mean, it would be nice to have Oxford in, it would be nice to have Harvard in, but MIT participates, and they're a reasonably good school, I believe. Spider-Man applied there, so it's got to be reasonable.
And those so-called top universities have plenty of rankings they can participate in. If you say there are 300 top universities in the world, well, what about the other 36,000, 40,000 institutions?
Alex Usher: Sure, yeah. I just want to end on a slightly different note. I was doing a bit of background research for this interview and I read about your involvement with, and I think you founded, this organization DataKind, which is a data charity. I've never heard of a data charity before, and I'm fascinated by it, and kind of intrigued enough to think about starting one here. Tell us about DataKind and what it does.
Duncan Ross: Thank you very much. So DataKind was actually set up in the U.S. by Jake Porway. What happened is that I went to one of the early big data conferences, O'Reilly's Strata Conference in New York, and came across Jake. He was talking about how data could be used for good. And I had been involved in leadership roles in a number of UK charities, and it was a second light bulb moment. So I went to Jake and said, you know, please let me found an equivalent in the UK. And he said, yeah, sure, sometime. And I just kept nagging him and nagging him until eventually he gave in and said yes. So, together with some amazing people here in the UK, Fran Bennett, Kaitlin Thaney, Stuart Townsend, we set up DataKind UK. The concept is very simple. If you're a telco or a retail company or a finance company, you can use data to do more things; you can use it to be a better organization. The same is true in the third sector. The difference is that banks can afford to employ data scientists and charities frequently can't. So DataKind UK, and DataKind generally, was set up to allow data scientists to volunteer their time for the third sector. Now, there are some things that are needed. You need an organization that has the leadership to be willing to do this, that has a good problem that can be described analytically, and that isn't always the case, and, finally, that has some data. Because it doesn't help if you don't have the data. But DataKind UK, and DataKind in the US and elsewhere in the world, have done some amazing work with the sector, helping organizations to understand what their data is telling them, and therefore how they can change, how they can be better with their resources, how they can do more for the people that they are serving.
Now, I stood down from DataKind UK back in 2020. I'm a firm believer that if you've done something really good, the true test is when you step away and it continues and survives. I'm pleased to say it is still thriving in the UK. And I kind of hope that, you know, the Impact Rankings continue to thrive at Times Higher Education now that I've left.
Alex Usher: Yeah. Well, thank you for joining us today, Duncan.
Duncan Ross: It's been a pleasure.
Alex Usher: And it just remains for me to thank our excellent producers, Sam Pufek and Tiffany MacLennan, and you, our viewers, listeners, and readers, for joining us today. If you have any questions or comments about today's episode, please don't hesitate to get in touch with us at podcast@higheredstrategy.com. Worried about missing an episode of the World of Higher Education? There's a solution for that: go to our YouTube page and subscribe. Next week, our guest will be Jim Dickinson. He's an associate editor at Wonkhe in the UK, and he's also maybe the world expert on comparative student politics. He joins us to talk about the events in Serbia, where the student movement is challenging the populist government of the day. Bye for now.
