October 20, 2014

'Times Higher Education' Releases New Rankings, but Will They Appease Skeptics?

On Thursday the London-based Times Higher Education releases its new, and heavily hyped, World University Rankings. Nearly a year in the making, the rankings have been highly anticipated, if only to determine whether the magazine has truly delivered on its promise: to create an evaluation system based primarily on reliable, and quantifiable, measures of quality rather than on subjective values, such as reputational surveys.

Times Higher Education produced rankings for the first time this year without the collaboration of Quacquarelli Symonds Limited. Along with the Shanghai Jiao Tong University rankings, the World University Rankings that Times Higher Education and QS published together from 2004 until last year have become the most closely watched and influential university rankings in the world.

Quacquarelli Symonds has continued to produce those rankings, now called the QS World University Rankings, and is partnering with U.S. News and World Report for their publication in the United States.

The relationship between the former collaborators has deteriorated into barely veiled animosity. QS has accused Times Higher Education of unfairly disparaging the tables they once published together. This week the company threatened legal action against the magazine over what Simona Bizzozero, a QS spokeswoman, described as "factually inaccurate" and misleading statements by representatives of Times Higher Education. She said THE's role in the collaboration was limited to publishing the rankings based on a methodology that QS had developed. "What they're producing now is a brand-new exercise. A totally brand-new exercise, with absolutely no links whatsoever to what QS produced and is producing," she said. "So when they refer to their old methodology, that is not correct."

Phil Baty, editor of the rankings for Times Higher Education, declined to respond to QS's complaints: "We are now looking forward, not looking backward."

The release last week of the new QS rankings generated headlines, especially in Britain, with the displacement of Harvard as the world's top university by the University of Cambridge. QS's full list of the top 400 universities will be published next week by U.S. News.

Times Higher Education, by contrast, places Harvard in first place, followed by the California Institute of Technology, the Massachusetts Institute of Technology, Stanford, and then Princeton. Cambridge and Oxford tie for sixth place, the highest spot occupied by a university outside the United States.

"There is no question that this is a real wake-up call for the U.K.," said Mr. Baty. "This confirms, more than ever, that the U.S. has absolutely the world-class education system." He did note, however, that the data on which the rankings are based predate recent cuts in public financing for higher education in the United States.

Times Higher Education is now collaborating with the media conglomerate Thomson Reuters, which is providing the data on which its rankings are tabulated. Because the tables were produced using a new methodology, they represent "Year 1 of a new system," and "you can't make direct comparisons" with the previous rankings, said Mr. Baty.

Nonetheless, Times Higher Education is emphasizing what it describes as the increased rigor of its new methodology, which according to its news release "places less importance on reputation and heritage than in previous years and gives more weight to hard measures of excellence in all three core elements of a university's mission—research, teaching, and knowledge transfer."

Foremost among the criticisms of the previous compilation was that it relied too heavily on a reputational survey of academics, based on fewer than 4,000 responses in 2009. THE's new methodology is based on 13 indicators in five broad performance categories—teaching (weighted 30 percent); research influence as measured in citations (32.5 percent); research, based on volume, income, and reputation (30 percent); internationalization, based on student and staff ratios (5 percent); and knowledge transfer and innovation based on industry income (2.5 percent).
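The five category weights above amount to a simple weighted sum that totals 100 percent. As a minimal illustration of that arithmetic (the weights come from the article; the sample category scores and the function name are invented for the example):

```python
# Category weights as reported for the new THE methodology (they sum to 1.0).
WEIGHTS = {
    "teaching": 0.30,
    "citations": 0.325,
    "research": 0.30,
    "internationalization": 0.05,
    "industry_income": 0.025,
}

def composite_score(scores: dict) -> float:
    """Weighted sum of per-category scores, each on a 0-100 scale."""
    # Sanity check: the weights must account for 100 percent of the score.
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    return sum(WEIGHTS[cat] * scores[cat] for cat in WEIGHTS)

# A hypothetical institution scoring 80 in every category scores 80 overall,
# since the weights sum to 1.
example = {cat: 80.0 for cat in WEIGHTS}
print(composite_score(example))
```

This is only a sketch of the weighting arithmetic, not THE's actual scoring pipeline, which normalizes the underlying indicators before combining them.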

Times Higher Education said that the new system was the only global ranking to devote a section to teaching. The new methodology is much more evidence-based and relies far less on subjective criteria than the old tables, said Mr. Baty. But whereas teaching was previously measured solely by student-staff ratio, the new rankings now incorporate a reputational survey of teaching.

Skeptics Not Swayed

But will the Times's new system impress critics? If the reaction of two of the most outspoken and influential rankings experts is any gauge, perhaps not.

"Really, nothing has changed," said Ellen Hazelkorn, executive director of the Higher Education Policy Research Unit at the Dublin Institute of Technology, whose book "Rankings and the Battle for World-Class Excellence: The Reshaping of Higher Education" is due to be published in March.

Despite Times Higher Education's assurances that the new tables represent a much more rigorous and reliable guide than the previous rankings, the indicators on which the new rankings are based are just as problematic in their own way, she believes. The heavily weighted measure of teaching, which she described as subjective and based on reputation, introduces a new element of unreliability.

Gauging research impact through a subjective, reputation-based measure is troublesome enough, and "the reputational aspect is even more problematic once you extend it to teaching," she said.

Ms. Hazelkorn is also troubled by the role Thomson Reuters is playing through its Global Institutional Profiles Project, to which institutions provide the data used in the tables. She dislikes the fact that institutions are going to great effort and expense to compile data that the company could then sell in various ways.

"This is the monetarization of university data, like Bloomberg made money out of financial data," she said.

Geoffrey S. Boulton, a leading University of Edinburgh academic who wrote a recent report, "University Rankings: Diversity, Excellence and the European Initiative," for the League of European Research Universities, agrees that the new rankings do not represent a significant improvement. "One of the problems is that you have a system that is not well designed for purpose, and collecting more information will add nothing at all," he said.

Merely adding more detail, as he said the new rankings had done, obscures the underlying problem, which is that rankings depend on inherently unreliable proxy measures to assess the things they purport to be measuring, he said.

Coming up with an effective way of measuring teaching excellence, for example, is just one hurdle.

"I can think of lots of proxies, but the most fundamental proxy of all is the ethos and commitment of the people in the place, and how can you measure that?" asked Mr. Boulton. "The only way, in a sense, is by going to a place and sensing it, and this is not practicable and is profoundly subjective."

Unfortunately, he noted, the effect of rankings placing so much emphasis on proxies for teaching excellence, such as the number of academic staff who have Ph.D.'s, is that teaching may in fact be suffering.

The combined impact of the influence of global rankings and the weight they give to research, together with Britain's national program for allocating university financing based largely on research, means that in British universities, "the dominant driver of activity is research, and often not research of a very high quality," said Mr. Boulton. "The consequence is that many of the best teachers have felt rather alienated."

Despite their skepticism of the rankings' inherent worth, both Ms. Hazelkorn and Mr. Boulton acknowledge that rankings are an unavoidable feature of today's higher-education landscape.

"Given that they are here to stay, they will no doubt become more elaborate, and one of the key issues is who is this going to influence and is the influence it has on them appropriate, proper, and sensible," said Mr. Boulton.

Comments

1. tj2010 - September 15, 2010 at 08:00 pm

Both the Times and QS are still very subjective. Jiaotong is based on objective measures, but it's questionable as to whether they're relevant at all. A new ranking initiative from Australia which tries to counter these problems certainly seems interesting:
www.highimpactuniversities.com

2. cewatt - September 16, 2010 at 08:13 am

As always, what I fail to understand is "why." What does this do for any of us? We have presidents obsessed with rankings, parents worrying if they matter, and endless wranglings about what they mean. I may be old-fashioned at this point, but what makes each of our schools wonderful and special and excellent cannot be captured in any ranking simply because it varies for each student.

Granted, we have driven this madness ourselves. The rankings would not be what they are unless our schools fell for the game. Somehow we just can't resist playing along and facilitating the madness.

3. 22228715 - September 16, 2010 at 08:26 am

I tell my students that colleges and universities are like people, with their own personalities and histories and psychological hang-ups. If you made a list of everyone you know, can you say that some of them are more smart/attractive/funny/generous/athletic than others? Sure. Will that make a difference when you choose to spend time with someone for the purpose of advice/romance/dining/discussion/learning/tennis? Sure. Is it a useful exercise to rank order each and every one in order of their combined smartness/attractiveness/funniness/generosity/athleticism and to spend extra time making sure the top 10 are in exactly the right order? Umm....

4. 22228715 - September 16, 2010 at 08:34 am

Just read some of the rankings... the one-paragraph descriptions of universities have glaring factual errors that are correctable by reading the intro web pages of the institutions. It's hard to take the other data seriously when the basic info is wrong.

5. oscar003 - September 16, 2010 at 08:38 am

The best universities operate their teaching and research programmes at such a high level of abstraction that it is not possible to devise "objective" methods of assessing their performance. It may be reasonably clear that Oxford and Cambridge are the best two universities in the United Kingdom but it cannot be claimed that they are better or worse than Harvard or Yale in the United States. The Times Higher Education Supplement's ranking list is really neither here nor there in this respect.

6. dank48 - September 16, 2010 at 09:04 am

"Tweedledum and Tweedledee resolved to have a battle
For Tweedledee said Tweedledum had spoilt his new rattle."

Lewis Carroll somehow saw this coming a century and a half ago; what prescience.

7. xtrcrnchy4 - September 16, 2010 at 09:43 am

One of my favorites is how so many folks used to vote for Princeton's law school as one of the best in the U.S., until they were made to realize that Princeton doesn't have a law school. This is no different, just more elaborate.

8. bstevens - September 16, 2010 at 09:48 am

When we had some visitors from Chinese universities, the first thing they did was rush to the downtown bookstore to buy college ranking books. Apparently they use them with students who want to come to the U.S. for college. I could understand if in China a book of college rankings were published by the Communist government, that it would be taken as Gospel, sort of like a command. But here, it really is just another book to be sold or a marketing ploy for some other publication. The fact that naive parents (and curious college administrators) still buy them proves they are a good investment. Pretty much useless to the buyer, but great for the seller.

9. dank48 - September 16, 2010 at 10:20 am

And that, Bstevens, tells you what you need to know about the publishing industry, aka the book biz. That and the fact that, for really poor business reasons, we once in a great while turn out a decent book or two, if only because we just can't help ourselves.

10. pankaboi - September 16, 2010 at 10:56 am

What is the use of rankings? Best universities, good universities, and average universities produce, every year, in their own fashion, excellent and dedicated graduates who serve their communities and the world well. In addition, good ideas and inventions do not come from only top-ranked universities. Many often come from so-called low-class universities. It makes little difference to the human community which university useful knowledge and inventions come from. We should, therefore, rather encourage each and every student in the world's universities to contribute his or her mite of scholarship to human knowledge, development and progress. We should encourage each and every teacher/researcher to give his or her best to his or her university.

Stop breaking the hearts of hardworking and committed teachers, researchers and students with these rankings! Stop the rankings!

11. davi2665 - September 16, 2010 at 02:43 pm

What is this insane obsession with "rankings"? They are at best artificial, and capture meaningless numbers and statistics. If the top administrators and alumni want to get together for comparative statistics, let them participate in the major obsession of the governing boards of universities: "My endowment is bigger than your endowment." How impressive!

12. thirtyeyes - September 16, 2010 at 02:59 pm

They seem kind of silly, but the first thing I did after reading this article was google the list and find my alma mater. #54!

13. bob_malooga_looga - September 16, 2010 at 03:57 pm

Why the rankings exist: ego.
Why the rankings are here to stay: money.

14. simonj55 - September 16, 2010 at 05:58 pm

Funny what you can turn into a business.

15. drstarsky - September 16, 2010 at 07:38 pm

"Ecole Polytechnic". Sorry, I cannot take this seriously.

16. brebenne - September 17, 2010 at 12:06 am

I'm a bit surprised with the rankings, as are many people around the world. What is clear is that the rankings don't particularly add up for many countries and schools.

The biggest problem that people I've met have noticed is that there is no ranking for the University of Texas at Austin. The THE really should account for this glaring error. What makes this even more shocking is that the THE had them at #15 once. The statistical data can't honestly have changed that much.

There is also a great bias towards science and tech schools that should be acknowledged more freely.

We should continue to make more and more rankings so that the rankings have no meaning anymore. I feel that this is happening with the plethora of rankings now from Australia, China, Taiwan, and the UK.

Students from China do use this ranking some, as people have noted, although Chinese students realize that American postgraduate programs pay for PhD students whereas British and Australian schools don't. Chinese students go to Australia largely for visas, and now that the Aussie government has changed its regulation, numbers have plummeted. Britain is using its last capital, as its prestige is tarnished internationally because of its fees-for-degrees attitude. Try paying your way into Berkeley or Texas or Harvard - you can't. Chinese students choose where to study based upon prestige, jobs, and the possibility of getting a visa. If there is no visa, then they would rather study in America. For example, the average elite Chinese student would rather go to currently unranked Texas than to the National University of Singapore or Melbourne, all things being equal (unless we are talking about a visa), especially when you factor in the tiny funds available for non-Australian students.

Someone should do a ranking that looks at where the faculty gained their PhDs and where the university's PhDs get jobs. There was a US study that showed the tiers. What we would find is that very few of the top 200 get hired at the top 200 schools - that is to say, about 30 schools produce the majority of the PhDs for all 200 schools. This would be revealing and would ultimately show the prestige and research value of a university or program.

17. cmanderson - September 17, 2010 at 01:31 pm

From bob_malooga_looga:

Why the rankings exist: ego.
Why the rankings are here to stay: money.

That says it all for me. Just wish my superstitious high school senior would believe it.

I want to know how these schools encourage student camaraderie and positive interaction. I want to know which schools help students learn the value of serving the community upon graduation rather than just feathering their nests and going around feeling entitled.

Some graduates from high-ranked schools don't have great people skills. They are full of themselves because of where they went to school but can't interact with others very well. Know-it-alls.

18. richardtaborgreene - September 17, 2010 at 06:29 pm

Globalization means we all, when born, should have a metal tag attached to each of us, with our "at birth" ranking inserted in the tag, and with regular updates throughout our lives. That way any two people at any time in life could compare their rankings and allocate conversation, tea, women-furniture, and other things appropriately. For those deeply obtuse, I am being sarcastic here.

We are all monkeys and have well established brain modules for endless comparison and status qualms with our fellow monkeys. Publications can make endless money using this.

If I am an individual wanting to attend a university or a company wanting to hire good people output from universities---what do the rankings do for me?
A lot, really, even if it is all placebo effects--we think Harvard guys are best so we treat what they do and say specially, making them, after the fact, best (while they rip off $10 trillion from the world economy in their greedy self-concern--a majority of all Harvard grads, it seems).

19. zefelius - September 19, 2010 at 02:44 am

Do these spam ads actually work? Every time I see one, or get one in my email, I think I never, never want to support that company.

20. gringo_gus - September 20, 2010 at 04:17 pm

It would be a mistake to consider the THES as the UK equivalent to the Chronicle. It is not, not least because it is not written for faculty.

For a long time its significance amongst faculty has diminished. Look at its advisory board: they are all senior administrators. Its prizes for management and leadership are all for administrators, for who would think that leading a university requires intellectual work from faculty?

It is a journal for administrators. And it is purely in the interests of administrators to sustain the belief that it is possible to identify criteria on which the equivalence, or not, of universities can be judged.

Here is a qualifying criterion that I propose that should be the basis for determining whether universities can be ranked at all:

"In the country where this is located, is access to knowledge via the internet in any way restricted or monitored by the state".

And in case that is seen to show a local bias, another:

"In the country where this university is located, faculty cannot be sued for libel for publishing peer-reviewed research, nor can the journals in which they are published".

So, so much for rankings...

21. nathanielcampbell - September 20, 2010 at 04:33 pm

Can someone explain to me why Teaching only gets 30% of the grade, and research combines for more than 60%? Isn't education the primary goal of the university? Or is my dream of educating my students just a delusion?

22. jeanmck - September 23, 2010 at 06:10 am

While I think rankings are basically unnecessary, I do have to at least commend the THE for trying to give some serious weighting to teaching. The 'new' Australian rankings mentioned, High Impact Universities, is purely a ranking based on research. While I realise that this really is the priority for most of the 'high powered' universities because, let's face it, that is what gains the kudos, I think that we should all be applauding efforts to recognise teaching, however it occurs.

Any ranking system will put things in that many people consider irrelevant and will leave things out that others think are crucial. I have to agree that we shouldn't really be doing it at all, but that is not the way of the world, sadly. Here in the UK, there is a whole system for ranking universities on research and allocating funding based on it and I think it has caused irreparable harm to teaching. But I also know it isn't going away because there are too many vested interests.

Having gone to a small liberal arts school for undergrad (Trinity University, San Antonio) and a larger, very high ranking research school for my PhD (ranked number 20 in the THE), it was the quality of the teaching, the individual attention and the rigor of the intellectual expectations at BOTH that I benefited from, not where they ranked in research tables. I have also published some research that indicated that the research reputation had no effect on the achievement of students given the same teaching. They really are _separate things_.

So, once again, I have to at least give the THE a small pat on the back for giving more recognition to the importance of teaching than many such rankings do.

23. saraid - September 23, 2010 at 04:01 pm

I love it. A rigorous scoring system with five broad categories, each category being at least reasonably well-defined, if arguable in utility...

...except teaching, which is a magic word.

Someone check what schools the system creators came from; those schools should get negative marks in their teaching category.

24. dan1234 - September 27, 2010 at 05:26 pm

After reading this article I just can't help but post this comment about my website that solves the college recruiting problem for students and employers and allows students to prove that they are just as smart as any other student and allows universities to prove that they educate their students just as well as any other school. www.NoodleStorm.com is a website where organizations can submit problems of any type (e.g. business, nonprofit, public administration, technology, engineering, legal, etc.) for college students to solve, either real problems or test problems. The organizations can receive multiple solutions from students from all over the world and then whittle through those solutions and just pick the top candidates to come in for an interview. This will save corporations hundreds of thousands of dollars in recruiting costs. It also gives every college student in the world the opportunity to prove they are just as smart and hard working as any other student and to compete for jobs at top companies. Students and professors from most of the top schools in the U.S. and from several international schools are signed up and organizations are submitting problems. Thanks everyone for allowing me to post this unabashedly self-serving comment.

25. niti201012 - October 13, 2010 at 09:25 pm

The Immersive Education Initiative has issued an open call for iED Summit 2011 papers, posters, workshops, panels, general presentations, demos, and outliers (novel late-breaking research and technology). Boston College will host the three-day iED Summit from May 13-15, 2011. Speakers at past iED Summits have included faculty, researchers, and administrators from The Grid Institute, Boston College, Harvard University (Harvard Graduate School of Education, Berkman Center for Internet and Society at Harvard Law School, and Harvard Kennedy School of Government), Massachusetts Institute of Technology (MIT), MIT Media Lab, The Smithsonian Institution, Loyola Marymount University, Stanford University, United States Department of Education, National Aeronautics and Space Administration (NASA), Federation of American Scientists (FAS), Duke University, Temple University, Southeast Kansas Education Service Center, Immersive Education High School, Cornell University, Amherst College, Kauffman Foundation, Boston Library Consortium, Montana State University, South Park Elementary School, Boston Media High School, Sun Microsystems, Turner Broadcasting, Open Wonderland Foundation, realXtend (Finland), The MOFET Institute (Israel), University of Aizu (Japan), Keio University (Japan), National University of Singapore, Royal Institute of Technology (Sweden), University of Essex (UK), Coventry University (UK), Giunti Labs (Italy) and European Learning Industry Group, Open University (UK), and more. For more details visit http://ImmersiveEducation.org

The Immersive Education Initiative is an international non-profit consortium that is free for educators and provides thousands of free learning virtual worlds, learning games, resources and materials for educators who want to use immersive learning technologies in and out of the classroom. The Initiative has thousands of members that teach millions of students, and some of the best of their work is showcased at annual "iED Summit" conferences that are held around the world at different times. You can get more information at http://ImmersiveEducation.org or do a google search for "Immersive Education Initiative" where you will find thousands of results.
