A new international effort to gauge the performance of universities went online on Tuesday, promising to be a nuanced tool for students and institutions in the highly contentious field of global rankings.
U-Multirank, a project announced in 2011 and backed by the European Union, aims to foster greater transparency about higher education around the world, including in the United States. But while its approach has received praise, some experts say it still has a way to go before achieving its goals, and some higher-education groups have already raised questions about its methods.
On its website, U-Multirank allows users to select their own criteria and generate customized results. It compares five performance areas—teaching and learning, research, knowledge transfer, international orientation, and regional engagement—and grades them from A (“very good”) to E (“weak”). The indicators for knowledge transfer, for example, include the number of joint publications with industry. Another indicator reflects how often publications from a particular institution are cited in patents.
The European Union, which is providing more than $2.7-million for the initial phase of the effort, saw the approach as an improvement on traditional rankings, like the ones produced by QS and Times Higher Education. The project’s designers, which include the Center for Higher Education Policy Studies at the University of Twente, in the Netherlands, and the Center for Higher Education Development, in Gütersloh, Germany, say it is the first international ranking to reflect the diversity of higher education by including all types of institutions, including specialist colleges, smaller regional institutions, and universities of applied sciences.
Yet like other rankings, U-Multirank has faced its share of criticism.
Last year the League of European Research Universities, which represents some of the top research institutions in Europe, withdrew its support for the project, citing concerns about the lack of reliable data, among other issues.
Gero Federkeil, who coordinates U-Multirank and oversees the rankings of German institutions by the Center for Higher Education Development, said the league’s criticism “is no longer applicable.” Katrien Maes, the league’s chief policy officer, said in an email to The Chronicle that the organization would not comment on U-Multirank until it could examine initial results.
Little Interest Stateside
Data for some 870 institutions in 70 countries were analyzed for the project, including more than 100 institutions in the United States. However, for most of the British and American universities, much of that analysis relied on publicly available information, like patent databases and scientific publication and citation data. Fewer than 20 institutions in both Britain and the United States actively provided data for the project.
The few participants in the United States include Dartmouth College, Oregon State University, and Tufts University. Many American institutions pay little attention to international rankings, keeping a closer eye on the domestic ones from U.S. News & World Report. But Steve Clark, a spokesman for Oregon State, said the university had decided to take part because “we were interested in seeing how things played out and how we compared.” The university made its decision, he said, without knowing how many other American institutions were participating.
Mr. Federkeil said he was not concerned about the lack of participation from the United States, adding that he was confident that more institutions would want to take part in subsequent versions.
‘A Very Big Deal’
Ellen Hazelkorn, director of research and enterprise at the Dublin Institute of Technology and an expert on university rankings, said the project fell short of challenging the existing ranking efforts, as it promised to do. The U-Multirank indicators suffer from the same limitations as those used by all the others, she said. By relying on proxies for substantive information, they become an exercise in measuring only what you choose to measure. For example, “every European country has a different patent process,” and using patent citations as an indicator for research strength has little meaning, said Ms. Hazelkorn, who also writes for The Chronicle’s WorldWise blog.
Still, the new tool has moved the rankings discourse forward, she said. “I think the best way to describe U-Multirank is it is part of a wider trend toward more public disclosure; the train has left the station, and there is no going back.”
Thomas D. Parker, a senior associate at the Institute for Higher Education Policy, in Washington, D.C., was more enthusiastic. He said that U-Multirank should be higher on the radar of American educators. “This is a very big deal,” he said. “Until now, rankings have been somewhere between bad journalism and bad science, and we’re beginning to see much more serious work.”
Dubbing U-Multirank the “thinking man’s ranking,” he said it is more akin to a consumer-information system, an approach that also underpins President Obama’s proposed college-ratings system.
European students have been at the forefront of the push for greater transparency in higher education that led to U-Multirank, Mr. Parker said, and their American counterparts should heed their example. “If I were a U.S. student, I would be saying we want something like U-Multirank here,” he said. “But U.S. students don’t speak with the same kind of unified voice as Europeans do.”