|Editor||Ben Sowter (Head of Research)|
|Staff writers||Craig O'Callaghan|
|Publisher||Quacquarelli Symonds Limited|
|First issue||2004 (in partnership with THE); 2010 (independently)|
QS World University Rankings is an annual publication of university rankings by Quacquarelli Symonds (QS). Previously known as the Times Higher Education-QS World University Rankings, the publication was produced in partnership with Times Higher Education magazine (THE) from 2004 to 2009, after which the two organisations began announcing their own versions. QS retained the pre-existing methodology, while Times Higher Education adopted a new methodology for its rankings.
The QS system now comprises the global overall and subject rankings (which name the world's top universities for the study of 46 different subjects and five composite faculty areas), alongside five independent regional tables (Asia, Latin America, Emerging Europe and Central Asia, the Arab Region, and BRICS). It is the only international ranking to have received International Ranking Expert Group (IREG) approval, and is viewed as one of the most widely read rankings of its kind, alongside the Academic Ranking of World Universities and the Times Higher Education World University Rankings. However, it has been criticized for an over-reliance on subjective indicators and reputational surveys, which tend to fluctuate from year to year.
A perceived need for an international ranking of universities for UK purposes was highlighted in December 2003 in Richard Lambert's review of university-industry collaboration in Britain for HM Treasury, the finance ministry of the United Kingdom. Amongst its recommendations were world university rankings, which Lambert said would help the UK to gauge the global standing of its universities.
The idea for the rankings was credited in Ben Wildavsky's book, The Great Brain Race: How Global Universities are Reshaping the World, to then-editor of Times Higher Education (THE), John O'Leary. THE chose to partner with educational and careers advice company Quacquarelli Symonds (QS) to supply the data, appointing Martin Ince, formerly deputy editor and later a contractor to THE, to manage the project.
Between 2004 and 2009, QS produced the rankings in partnership with THE. In 2009, THE announced that it would produce its own rankings, the Times Higher Education World University Rankings, in partnership with Thomson Reuters. THE cited asserted weaknesses in the methodology of the original rankings, as well as a perceived favoritism of that methodology towards the sciences over the humanities, as two of the key reasons for the decision to split with QS.
QS retained intellectual property in the prior rankings and the methodology used to compile them and continues to produce rankings based on that methodology, which are now called the QS World University Rankings.
THE created a new methodology with Thomson Reuters, and published the first Times Higher Education World University Rankings in September 2010.
|Indicator||Weighting||Description|
|Academic peer review||40%||Based on an internal global academic survey|
|Faculty/student ratio||20%||A measurement of teaching commitment|
|Citations per faculty||20%||A measurement of research impact|
|Employer reputation||10%||Based on a survey of graduate employers|
|International student ratio||5%||A measurement of the diversity of the student community|
|International staff ratio||5%||A measurement of the diversity of the academic staff|
QS publishes the rankings results in the world's media and has entered into partnerships with a number of outlets, including The Guardian in the United Kingdom and Chosun Ilbo in Korea. The first rankings produced by QS independently of THE, using the original methodology it had retained, were released on September 8, 2010, with the second edition appearing on September 6, 2011.
QS designed its rankings in order to assess performance according to what it believes to be key aspects of a university's mission: teaching, research, nurturing employability, and internationalisation.
Academic peer review
This survey is the most controversial part of the methodology. Using a combination of purchased mailing lists, applications, and suggestions, it asks active academics across the world to identify the leading universities in their specialist fields. QS has published the job titles and geographical distribution of the participants.
The 2016/17 rankings made use of responses from 74,651 people in over 140 nations for the Academic Reputation indicator, with responses from the previous five years rolled forward where no more recent response was available from the same individual. Participants can nominate up to 30 universities but cannot vote for their own. They nominate a median of about 20 institutions, meaning the survey yields well over 500,000 data points. The average respondent has 20.4 years of academic experience, and 81% of respondents have more than a decade of experience in the academic world.
In 2004, when the rankings first appeared, academic peer review accounted for half of a university's possible score. In 2005, its share was cut to 40 per cent because of the introduction of the Employer Reputation Survey.
Faculty/student ratio
This indicator accounts for 20 per cent of a university's possible score in the rankings. It is a classic measure used in various ranking systems as a proxy for teaching commitment, but QS has admitted that it is less than satisfactory.
Citations per faculty
Citations of published research are among the most widely used inputs to national and global university rankings. The QS World University Rankings used citations data from Thomson (now Thomson Reuters) from 2004 to 2007, and since then has used data from Scopus, part of Elsevier. The total number of citations for a five-year period is divided by the number of academics in a university to yield the score for this measure, which accounts for 20 per cent of a university's possible score in the Rankings.
QS has explained that it uses this approach, rather than the citations-per-paper measure preferred by other systems, because it reduces the effect of biomedical science on the overall picture: biomedicine has a ferocious "publish or perish" culture. Instead, QS attempts to measure the density of research-active staff at each institution. Issues remain about the use of citations in ranking systems, however, especially the fact that the arts and humanities generate comparatively few citations.
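As a rough sketch of the arithmetic described above, the example below illustrates a citations-per-faculty calculation: a five-year citation total divided by the number of academic staff. The function name and figures are hypothetical rather than QS data, and the real indicator is further standardised before being weighted.

```python
def citations_per_faculty(total_citations_5yr: int, academic_staff: int) -> float:
    """Illustrative only: five-year citation total divided by academic headcount."""
    if academic_staff <= 0:
        raise ValueError("academic staff count must be positive")
    return total_citations_5yr / academic_staff

# Hypothetical institution: 120,000 citations over five years, 4,000 academics.
print(citations_per_faculty(120_000, 4_000))  # 30.0
```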
However, since 2015, QS has made methodological enhancements designed to remove the advantage previously enjoyed by institutions specializing in the natural sciences or medicine. This enhancement, termed faculty area normalization, ensures that an institution's citation count in each of QS's five key faculty areas is weighted to account for 20% of the final citations score.
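A minimal sketch of how such a normalisation could work, assuming each of the five faculty areas is first given its own citations score and the five scores are then combined with an equal 20% weight; the area scores below are invented for illustration and do not reflect QS's actual scaling.

```python
# Hypothetical, already-scaled citation scores (0-100) for one institution
# across QS's five faculty areas.
area_scores = {
    "Arts & Humanities": 55.0,
    "Engineering & Technology": 78.0,
    "Life Sciences & Medicine": 92.0,
    "Natural Sciences": 88.0,
    "Social Sciences & Management": 60.0,
}

# Faculty area normalisation as described above: each area contributes an
# equal 20% to the final citations score, so a single high-citation field
# such as medicine cannot dominate the indicator.
final_citations_score = sum(0.20 * score for score in area_scores.values())
print(round(final_citations_score, 1))  # 74.6
```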
QS has conceded the presence of some data collection errors regarding citations per faculty in previous years' rankings.
One notable issue is the difference between the Scopus and Thomson Reuters databases. For major world universities, the two systems capture more or less the same publications and citations. For less mainstream institutions, Scopus includes more non-English-language and smaller-circulation journals, but because the papers in those journals are less heavily cited, this can also mean fewer citations per paper for the universities that publish in them. This aspect of the methodology has been criticized for disadvantaging universities that do not use English as their primary language, since publications and citations in languages other than English are harder to find and English remains the dominant language of international scholarly citation.
Employer reputation
This part of the ranking is obtained by a method similar to the academic peer review, except that it samples recruiters who hire graduates on a global or significant national scale. The numbers are smaller - 37,781 responses from over 130 countries in the 2016 rankings - and are used to produce 10 per cent of any university's possible score. This survey was introduced in 2005 in the belief that employers track graduate quality, making it a barometer of teaching quality, a famously problematic thing to measure. University standing here is of special interest to potential students, and acknowledging this was the impetus behind the inaugural QS Graduate Employability Rankings, published in November 2015.
International student and staff ratios
The final ten per cent of a university's possible score is derived from measures intended to capture its internationalism: five per cent from the percentage of international students, and another five per cent from the percentage of international staff. This is of interest partly because it shows whether a university is putting effort into being global, but also because it indicates whether it is taken seriously enough by students and academics around the world for them to want to be there.
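Taken together, the weightings quoted in this section (40% academic reputation, 20% faculty/student ratio, 20% citations per faculty, 10% employer reputation, and 5% each for international students and staff) imply that the overall score behaves like a weighted sum of the six indicator scores. The sketch below shows that arithmetic with invented indicator values; in practice QS also standardises each indicator (see the Z-score discussion later in this article) before combining them.

```python
# Indicator weights as quoted in this section (they sum to 1.0).
WEIGHTS = {
    "academic_reputation": 0.40,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "employer_reputation": 0.10,
    "international_students": 0.05,
    "international_staff": 0.05,
}

def overall_score(indicator_scores: dict) -> float:
    """Weighted sum of per-indicator scores, each assumed to be on a 0-100 scale."""
    return sum(weight * indicator_scores[name] for name, weight in WEIGHTS.items())

# Hypothetical institution, for illustration only.
example = {
    "academic_reputation": 85.0,
    "faculty_student_ratio": 70.0,
    "citations_per_faculty": 65.0,
    "employer_reputation": 90.0,
    "international_students": 50.0,
    "international_staff": 40.0,
}
print(overall_score(example))  # 74.5
```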
In September 2015, both The Guardian and The Daily Mail referred to the QS World University Rankings as "the most authoritative of their kind". In 2016, Ben Sowter, Head of Research at the QS Intelligence Unit, was ranked in 40th position in Wonkhe's 2016 'Higher Education Power List'. The list enumerated what the organisation believed to be the 50 most influential figures in UK higher education.
Several universities in the UK and the Asia-Pacific region have commented on the rankings positively. The Vice-Chancellor of New Zealand's Massey University, Professor Judith Kinnear, said that the Times Higher Education-QS ranking is a "wonderful external acknowledgement of several university attributes, including the quality of its research, research training, teaching and employability." She said the rankings are a true measure of a university's ability to fly high internationally: "The Times Higher Education ranking provides a rather more sophisticated, robust and well rounded measure of international and national ranking than either New Zealand's Performance Based Research Fund (PBRF) measure or the Shanghai rankings." In September 2012 the British newspaper The Independent described the QS World University Rankings as being "widely recognised throughout higher education as the most trusted international tables".
Angel Calderon, Principal Advisor for Planning and Research at RMIT University and member of the QS Advisory Board, spoke positively of the QS University Rankings for Latin America, saying that the "QS Latin American University Rankings has become the annual international benchmark universities use to ascertain their relative standing in the region". He further stated that the 2016/17 edition of this ranking demonstrated improved stability.
Certain commentators have expressed concern about the use or misuse of survey data. However, QS's Intelligence Unit, responsible for compiling the rankings, states that the size of the samples used for its surveys makes them "almost impossible to manipulate and very difficult for institutions to 'game'". It also states that "over 62,000 academic respondents contributed to our 2013 academic results, four times more than in 2010. Independent academic reviews have confirmed these results to be more than 99% reliable". Furthermore, since 2013, the number of respondents to QS's Academic Reputation Survey has increased again: the survey now makes use of nearly 75,000 academic peer reviews, making it "to date, the world's largest aggregation of feeling in this [the global academic] community."
The QS World University Rankings have been criticised by many for placing too much emphasis on peer review, which accounts for 40 per cent of the overall score. Some observers have expressed concern about the manner in which the peer review is carried out. In a report, Peter Wills from the University of Auckland, New Zealand, wrote of the Times Higher Education-QS World University Rankings:
But we note also that this survey establishes its rankings by appealing to university staff, even offering financial enticements to participate (see Appendix II). Staff are likely to feel it is in their greatest interest to rank their own institution more highly than others. This means the results of the survey and any apparent change in ranking are highly questionable, and that a high ranking has no real intrinsic value in any case. We are vehemently opposed to the evaluation of the University according to the outcome of such PR competitions.
However, QS states that no survey participant, academic or employer, is offered a financial incentive to respond, and that no academic is able to vote for their own institution. This renders this particular criticism invalid, as it rests on two incorrect premises: (1) that academics are financially incentivized to participate, and (2) that conflicts of interest are created by academics being able to vote for their own institution.
Academics have previously criticized the use of the citation database, arguing that it undervalues institutions which excel in the social sciences. Ian Diamond, former chief executive of the Economic and Social Research Council and now vice-chancellor of the University of Aberdeen and a member of the THE editorial board, wrote to Times Higher Education in 2007, saying:
The use of a citation database must have an impact because such databases do not have as wide a cover of the social sciences (or arts and humanities) as the natural sciences. Hence the low position of the London School of Economics, caused primarily by its citations score, is a result not of the output of an outstanding institution but the database and the fact that the LSE does not have the counterweight of a large natural science base.
However, in 2015, QS's introduction of faculty area normalization ensured that QS's rankings no longer conferred an undue advantage or disadvantage upon any institution based on their particular subject specialisms. Correspondingly, the London School of Economics rose from 71st in 2014 to 35th in 2015 and 37th in 2016.
Since the split from Times Higher Education in 2009, further concerns about the methodology QS uses for its rankings have been raised by several experts.
In October 2010, criticism of the old system came from Fred L. Bookstein, Horst Seidler, Martin Fieder and Georg Winckler in the journal Scientometrics, citing the unreliability of QS's methods:
Several individual indicators from the Times Higher Education Survey (THES) data base - the overall score, the reported staff-to-student ratio, and the peer ratings - demonstrate unacceptably high fluctuation from year to year. The inappropriateness of the summary tabulations for assessing the majority of the "top 200" universities would be apparent purely for reason of this obvious statistical instability regardless of other grounds of criticism. There are far too many anomalies in the change scores of the various indices for them to be of use in the course of university management.
In an article for the New Statesman entitled "The QS World University Rankings are a load of old baloney", David Blanchflower, a leading labour economist, said: "This ranking is complete rubbish and nobody should place any credence in it. The results are based on an entirely flawed methodology that underweights the quality of research and overweights fluff... The QS is a flawed index and should be ignored." 
However, Martin Ince, chair of the Advisory Board for the Rankings, points out that their volatility has been reduced since 2007 by the introduction of the Z-score calculation method, and that over time the quality of QS's data gathering has improved to reduce anomalies. In addition, the academic and employer surveys are now so large that even modestly ranked universities receive a statistically valid number of votes. QS has published extensive data on who the respondents are, where they are located, and the subjects and industries to which the academics and employers respectively belong.
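The Z-score method Ince refers to is a standard statistical normalisation: each institution's raw indicator value is expressed in standard deviations from the mean across all ranked institutions, which reduces the influence of extreme values and year-to-year swings. A minimal sketch with invented values follows (QS additionally rescales the results, typically to a 0-100 range, which is not shown here):

```python
from statistics import mean, stdev

def z_scores(values):
    """Standardise raw indicator values to mean 0 and standard deviation 1."""
    mu, sigma = mean(values), stdev(values)
    return [(v - mu) / sigma for v in values]

# Hypothetical citations-per-faculty values for five institutions.
raw = [12.0, 30.0, 45.0, 80.0, 150.0]
print([round(z, 2) for z in z_scores(raw)])  # approx. [-0.94, -0.61, -0.34, 0.3, 1.59]
```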
The QS Subject Rankings have been dismissed as unreliable by Brian Leiter, who points out that programmes which are known to be high quality, and which rank highly in the Blackwell rankings (e.g., the University of Pittsburgh) fare poorly in the QS ranking for reasons that are not at all clear. However, the University of Pittsburgh was ranked in the number one position for Philosophy in the 2016 QS World University Rankings by Subject, while Rutgers University - another university that Leiter argued was given a strangely low ranking - was ranked number three in the world in the same ranking. An institution's score for each of QS's metrics can be found on the relevant ranking page, allowing those wishing to examine why an institution has finished in its final position to gain access to the scores that contributed to the overall rank.
In an article titled The Globalisation of College and University Rankings and appearing in the January/February 2012 issue of Change magazine, Philip Altbach, professor of higher education at Boston College and also a member of the THE editorial board, said: "The QS World University Rankings are the most problematical. From the beginning, the QS has relied on reputational indicators for half of its analysis ... it probably accounts for the significant variability in the QS rankings over the years. In addition, QS queries employers, introducing even more variability and unreliability into the mix. Whether the QS rankings should be taken seriously by the higher education community is questionable."
Simon Marginson, professor of higher education at University of Melbourne and a member of the THE editorial board, in the article "Improving Latin American universities' global ranking" for University World News on 10 June 2012, said: "I will not discuss the QS ranking because the methodology is not sufficiently robust to provide data valid as social science". QS's Intelligence Unit counter these criticisms by stating that "Independent academic reviews have confirmed these results to be more than 99% reliable".
The 2018 QS World University Rankings, published on June 8, 2017, was the fourteenth edition of the overall ranking. It confirmed Massachusetts Institute of Technology as the world's highest-ranked university for a sixth successive year. In doing so, MIT equalled Harvard University's record of consecutive number-one positions. A total of 959 universities feature in the published tables, representing 84 countries.
|Institution||2010/11||2011/12||2012/13||2013/14||2014/15||2015/16||2016/17||2017/18|
|Massachusetts Institute of Technology||5||3||1||1||1||1||1||1|
|University of Cambridge||1||1||2||3||2||3||4||5|
|California Institute of Technology||9||12||10||10||8||5||5||4|
|University of Oxford||6||5||5||6||5||6||6||6|
|University College London||4||7||4||4||5||7||7||7|
|Imperial College London||7||6||6||5||2||8||9||8|
|University of Chicago||8||8||8||9||11||10||10||9|
|Swiss Federal Institute of Technology in Zurich||18||18||13||12||12||9||8||10|
QS also releases the QS Top 50 Under 50 ranking annually, ranking universities which have been established for fewer than 50 years. These institutions are judged on their positions in the overall table of the previous year. From 2015, the "Top 50 Under 50" ranking was expanded to include the world's top 100 institutions under 50 years of age, and in 2017 it was expanded again to include the world's top 150 universities in this cohort. In 2017, the table was topped by Nanyang Technological University of Singapore for the fourth consecutive year. The table is dominated by universities from the Asia-Pacific region, with the top six places taken by Asian institutions.
QS also ranks universities by academic discipline, organized into five faculty areas: Arts & Humanities, Engineering & Technology, Life Sciences & Medicine, Natural Sciences, and Social Sciences & Management. The methodology is based on surveying expert academics and global employers, and on measuring research performance using data sourced from Elsevier's Scopus database. The 2017 QS World University Rankings by Subject name the world's best universities for the study of 46 different subjects. The four subject tables added in the most recent edition are Anatomy & Physiology, Hospitality & Leisure Management, Sports-related Subjects, and Theology.
Harvard University leads the 2017 tables with the most number-one positions, ranking first in 15 subjects. Its longtime rankings rival, Massachusetts Institute of Technology, is ranked first in 12 subjects.
|Arts & Humanities||Engineering & Technology||Life Sciences & Medicine||Natural Sciences||Social Sciences & Management|
|Archaeology||Chemical Engineering||Agriculture & Forestry||Chemistry||Accounting & Finance|
|Architecture & Built Environment||Civil & Structural Engineering||Biological Sciences||Earth & Marine Sciences||Anthropology|
|Art & Design||Computer Science & Information Systems||Dentistry||Environmental Sciences||Business & Management Studies|
|English Language & Literature||Electrical & Electronic Engineering||Medicine||Geography||Communication & Media Studies|
|History||Mechanical, Aeronautical & Manufacturing Engineering||Nursing||Materials Science||Development Studies|
|Linguistics||Mineral & Mining Engineering||Pharmacy & Pharmacology||Mathematics||Economics & Econometrics|
|Modern Languages||-||Psychology||Physics & Astronomy||Education & Training|
|Performing Arts||-||Anatomy & Physiology||-||Hospitality & Leisure Management|
|Theology, Divinity & Religious Studies||-||-||-||Politics & International Studies|
|-||-||-||-||Social Policy & Administration|
|-||-||-||-||Statistics & Operational Research|
In 2015, in an attempt to meet student demand for comparative data about the employment prospects offered by prospective or current universities, QS launched the QS Graduate Employability Rankings. The most recent instalment, released for the 2017/18 academic year, ranks 500 universities worldwide. It is led by Stanford University and features five universities from the United States in the top 10. The methodology consists of five indicators, three of which do not feature in any other ranking.
|Institution||2015/16||2016/17||2017/18|
|University of California, Los Angeles||1||15||2|
|The University of Sydney||14||4||4|
|Massachusetts Institute of Technology||2||2||5|
|University of Cambridge||4||5||6|
|University of Melbourne||n/a||11||7|
|University of Oxford||6||8||8|
|University of California, Berkeley||8||9||9|
In 2009, QS launched the QS Asian University Rankings or QS University Rankings: Asia in partnership with The Chosun Ilbo newspaper in Korea to rank universities in Asia independently. The eighth instalment, released for the 2016/17 academic year, ranks the 350 best universities in Asia, and is led by the National University of Singapore.
These rankings use some of the same criteria as the world rankings, but there are changed weightings and new criteria. One addition is the criterion of incoming and outgoing exchange students. Accordingly, the performance of Asian institutions in the QS World University Rankings and the QS Asian University Rankings released in the same academic year are different from each other.
|Institution||2009||2010||2011||2012||2013||2014||2015||2016|
|National University of Singapore||10||3||3||2||2||1||1||1|
|The University of Hong Kong||1||1||2||3||2||3||2||2|
|Nanyang Technological University||14||18||17||17||10||7||4||3|
|The Hong Kong University of Science and Technology||4||2||1||1||1||5||5||4|
|Korea Advanced Institute of Science and Technology||7||13||11||7||6||2||3||6|
|The City University of Hong Kong||18||15||15||12||12||11||9||7|
|The Chinese University of Hong Kong||2||4||5||5||7||6||6||8|
|Seoul National University||8||6||6||4||4||4||8||10|
The QS Latin American University Rankings or QS University Rankings: Latin America were launched in 2011. They use academic opinion (30%), employer opinion (20%), publications per faculty member, citations per paper, academic staff with a PhD, faculty/student ratio and web visibility (10 per cent each) as measures.
The 2016/17 edition of the QS World University Rankings: Latin America ranks the top 300 universities in the region. The Universidade de São Paulo retained its status as the region's best university.
|Pontificia Universidad Católica de Chile||2||1||3||3||1|
|Universidade Estadual de Campinas||3||3||2||2||2|
|Universidade de São Paulo||1||2||1||1||3|
|Universidad Nacional Autónoma de México||6||8||6||4||4|
|Instituto Tecnologico y de Estudios Superiores de Monterrey||7||7||9||7||5|
|Universidad de Chile||5||6||4||6||6|
|Universidade Federal do Rio de Janeiro||8||4||5||5||7|
|Universidad de los Andes||4||5||7||8||8|
|Universidad de Buenos Aires||12||19||15||11||9|
|Universidade Estadual Paulista (UNESP)||11||9||8||12||10|
QS also publishes a ranking for the BRICS countries (Brazil, Russia, India, China and South Africa), one of the regional tables noted above.
|Institution||2013||2014||2015||2016|
|University of Science and Technology of China||6||4||6||4|
|Shanghai Jiao Tong University||6||8||6||5|
|Indian Institute of Science Bangalore||-||-||5||6|
|Lomonosov Moscow State University||3||3||4||7|
|University of São Paulo||8||7||9||10|
In 2012, QS launched the QS Best Student Cities ranking - a table designed to evaluate which cities were most likely to provide students with a high-quality student experience. Five editions of the ranking have been published thus far, with Paris taking the number-one position in four of them. The newest edition of the ranking was released on February 15, 2017. It saw Montreal take the number-one spot; in doing so, the city became the first to take the number-one position from Paris. The 2017 edition was also the first one to see the introduction of student opinion as a contributory indicator.
QS also offers universities an auditing service that provides in-depth information about institutional strengths and weaknesses. Called QS Stars, this service is separate from the QS World University Rankings. It involves a detailed look at a range of functions which mark out a modern university. The minimum score that a university can receive is one star, while truly exceptional world-leading universities can receive '5*+', or 'Five Star Plus', status. The QS Stars audit process evaluates universities according to 50 different indicators. By 2016, 16 different universities worldwide had been awarded the maximum possible Five Star Plus rating.
QS Stars ratings are derived from scores on eleven criteria. Five of these are mandatory, and institutions must choose two of four additional optional categories.
Stars is an evaluation system, not a ranking. About 100 institutions had opted for the Stars evaluation as of early 2013. In 2012, the fee to participate in the program was $9,850 for the initial audit, plus an annual license fee of $6,850.
The methodology differs somewhat from that used for the QS World University Rankings...
It is a remarkably stable list, relying on long-term factors such as the number of Nobel Prize-winners a university has produced and the number of articles published in the journals Nature and Science. But with this narrow focus come drawbacks. China's priority was for its universities to "catch up" on hard scientific research. So if you're looking for raw research power, it's the list for you. If you're a humanities student, or more interested in teaching quality? Not so much.
Those two, as well as Shanghai Jiao Tong University, produce the most influential international university rankings out there
There are currently three major international rankings that receive widespread commentary: the Academic Ranking of World Universities, the QS World University Rankings and the Times Higher Education World University Rankings.
The major international rankings have appeared in recent months -- the Academic Ranking of World Universities, the QS World University Rankings, and the Times Higher Education World University Rankings (THE).