
QS World University Rankings

Article Id: WHEBN0025057928

Title: QS World University Rankings  
Author: World Heritage Encyclopedia
Language: English
Subject: RMIT University, Delft University of Technology, Curtin University, Tokyo Institute of Technology, Trinity College, Dublin
Collection: University and College Rankings
Publisher: World Heritage Encyclopedia

QS World University Rankings

Editor: Danny Byrne
Categories: Higher education
Frequency: Annual
Publisher: QS Quacquarelli Symonds Limited
Country: United Kingdom
Language: English
Website: QS World University Rankings

The QS World University Rankings are annual university rankings published by the British company Quacquarelli Symonds (QS). QS originally published its rankings in collaboration with Times Higher Education (THE) from 2004 to 2009 as the THE-QS World University Rankings. The partnership ended in 2010: QS resumed publishing on its own using the pre-existing methodology, while THE began a new cooperation with Thomson Reuters to produce the Times Higher Education World University Rankings. Today, the QS rankings comprise both world and regional league tables, which are independent of one another owing to differences in the criteria and weightings used to generate them.[1] The publication is one of the three most influential and widely observed international university rankings, alongside the Times Higher Education World University Rankings and the Academic Ranking of World Universities.[2][3][4]


  • 1 History
  • 2 World rankings
    • 2.1 Methodology
      • 2.1.1 Academic peer review (40%)
      • 2.1.2 Faculty student ratio (20%)
      • 2.1.3 Citations per faculty (20%)
      • 2.1.4 Recruiter review (10%)
      • 2.1.5 International orientation (10%)
      • 2.1.6 Aggregation
    • 2.2 Overall rankings
    • 2.3 Rankings by faculty and subject
    • 2.4 QS Top 50 under 50
  • 3 Regional rankings
    • 3.1 QS Asian University Rankings
    • 3.2 QS Latin American University Rankings
    • 3.3 QS BRICS University Rankings
  • 4 QS Stars
  • 5 Commentary
    • 5.1 General criticisms
    • 5.2 Subject rankings reliability
  • 6 Notes and references
  • 7 External links


History

The need for an international ranking of universities was highlighted in December 2003 in Richard Lambert's review of university-industry collaboration in Britain[5] for HM Treasury, the finance ministry of the United Kingdom. Amongst its recommendations were world university rankings, which Lambert said would help the UK to gauge the global standing of its universities.

The idea for the rankings was credited in Ben Wildavsky's book, The Great Brain Race: How Global Universities are Reshaping the World,[6] to then-editor of Times Higher Education (THE), John O'Leary. THE chose to partner with educational and careers advice company Quacquarelli Symonds (QS) to supply the data, appointing Martin Ince,[7] formerly deputy editor and later a contractor to THE, to manage the project.

Between 2004 and 2009, QS produced the rankings in partnership with THE. In 2009, THE announced that it would produce its own rankings, the Times Higher Education World University Rankings, in partnership with Thomson Reuters. THE cited a weakness in the methodology of the original rankings,[8] as well as a perceived favoritism toward science over the humanities,[9] as key reasons for the decision to split with QS.

QS retained the intellectual property in the Rankings and the methodology used to compile them and continues to produce the rankings, now called the QS World University Rankings.[10] THE created a new methodology with Thomson Reuters, published as the Times Higher Education World University Rankings in September 2010.

World rankings


QS publishes the rankings results in key media around the world, including US News & World Report in the United States and Chosun Ilbo in Korea. The first rankings produced by QS independently of THE, using QS's original methodology, were released on September 8, 2010, with the second edition appearing on September 6, 2011.

Methodology

QS designed its rankings to assess a broad range of university activity.

Academic peer review (40%)

The most controversial part of the QS World University Rankings is their use of an opinion survey referred to as the Academic Peer Review. Drawing respondents from a combination of purchased mailing lists, applications and suggestions, this survey asks active academics across the world to name the top universities in the fields they know best. QS has published the job titles and geographical distribution of the participants.[11]

The 2011 rankings drew on responses from 33,744 people in over 140 nations for the Academic Peer Review, with votes from the previous two years rolled forward where no more recent response was available from the same individual. Participants can nominate up to 30 universities but cannot vote for their own. They nominate a median of about 20, which means that the survey includes over 500,000 data points.[11]
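As a quick sanity check, the survey arithmetic quoted above can be reproduced directly (a sketch using only the figures given in the text; the variable names are mine):

```python
# Rough volume of the 2011 Academic Peer Review data set:
# ~33,744 respondents, each nominating a median of about 20 universities.
respondents = 33_744
median_nominations = 20

data_points = respondents * median_nominations
print(data_points)  # 674880, consistent with "over 500,000 data points"
```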

In 2004, when the rankings first appeared, academic peer review accounted for half of a university's possible score. In 2005, its share was cut to 40 per cent because of the introduction of the Recruiter Review.

Faculty student ratio (20%)

This indicator accounts for 20 per cent of a university’s possible score in the rankings. It is a classic measure used in various ranking systems as a surrogate for teaching commitment, but QS has admitted that it is less than satisfactory.[12]

Citations per faculty (20%)

Citations of published research are among the most widely used inputs to national and global university rankings. The QS World University Rankings used citations data from Thomson (now Thomson Reuters) from 2004 to 2007, and has since used data from Scopus, part of Elsevier. The total number of citations over a five-year period is divided by the number of academics at a university to yield the score for this measure, which accounts for 20 per cent of a university's possible score in the rankings.
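The indicator itself is a simple ratio. A minimal sketch (the figures are invented for illustration, since QS does not publish raw institutional data, and the function name is mine):

```python
def citations_per_faculty(total_citations: int, faculty_count: int) -> float:
    """Raw citations-per-faculty score: five-year citation total
    divided by the number of academics at the institution."""
    return total_citations / faculty_count

# A smaller, research-dense institution can outscore a larger one
# that has more total citations (illustrative numbers only).
print(citations_per_faculty(120_000, 600))    # 200.0
print(citations_per_faculty(180_000, 1_500))  # 120.0
```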

QS has explained that it uses this approach, rather than the citations per paper preferred for other systems, because it reduces the effect of biomedical science on the overall picture – bio-medicine has a ferocious “publish or perish” culture. Instead QS attempts to measure the density of research-active staff at each institution. But issues still remain about the use of citations in ranking systems, especially the fact that the arts and humanities generate comparatively few citations.[13]

QS has conceded the presence of some data collection errors regarding citations per faculty in previous years' rankings.[14]

One interesting issue is the difference between the Scopus and Thomson Reuters databases. For major world universities, the two systems capture more or less the same publications and citations. For less mainstream institutions, Scopus includes more non-English-language and smaller-circulation journals in its database. But because papers in those journals are less heavily cited, this can also mean fewer citations per paper for the universities that publish in them.[13] This area has been criticized for disadvantaging universities whose primary language is not English,[15] since citations and publications in languages other than English are harder to find, and English, as the most internationalized language of scholarship, is the most commonly cited.

Recruiter review (10%)

This part of the ranking is obtained by a similar method to the Academic Peer Review, except that it samples recruiters who hire graduates on a global or significant national scale. The numbers are smaller – 16,875 responses from over 130 countries in the 2011 Rankings – and are used to produce 10 per cent of any university’s possible score. This survey was introduced in 2005 in the belief that employers track graduate quality, making this a barometer of teaching quality, a famously problematic thing to measure. University standing here is of special interest to potential students.[16]

International orientation (10%)

The final ten per cent of a university's possible score is derived from measures intended to capture its internationalism: five per cent from its percentage of international students, and another five per cent from its percentage of international staff. This is of interest partly because it shows whether a university is putting effort into being global, and partly because it tells us whether it is taken seriously enough by students and academics around the world for them to want to be there.[17]


Aggregation

The data are aggregated according to Z-scores, an indicator of how far any institution is from the average. Between 2004 and 2007 a different system was used, whereby the top university on any measure was scaled as 100 and the others received a score reflecting their comparative performance. According to QS, this method was dropped because it gave too much weight to some exceptional outliers, such as the very high faculty/student ratio of the California Institute of Technology. In 2006, the last year before the Z-score system was introduced, Caltech topped the citations per faculty score, receiving 100 on this indicator because of its highly research- and science-oriented approach. The next two institutions on this measure, Harvard and Stanford, each scored 55. In other words, 45 per cent of the possible difference between all the world's universities lay between the top university and the next one (in fact two) on the list, leaving every other university on Earth to fight over the remaining 55 per cent.

Likewise in 2005, Harvard was top university and MIT was second with 86.9, so that 13 per cent of the total difference between all the world's universities was between first and second place. In 2011, the University of Cambridge was top and the second institution, Harvard, got 99.34. So the Z score system allows the full range of available difference to be used in a more informative way.
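The contrast between the two normalisation schemes, and the weighted aggregation of the indicators, can be sketched as follows. The raw citation figures here are invented purely to reproduce the 100/55 outlier pattern described above; only the weights come from the methodology sections.

```python
from statistics import mean, pstdev

# Invented raw citations-per-faculty figures shaped like the 2006
# example: one extreme outlier (Caltech) far ahead of the rest.
raw = {"Caltech": 350.0, "Harvard": 192.0, "Stanford": 192.0,
       "UCL": 120.0, "Michigan": 95.0}

# Pre-2007 method: scale so the top scorer gets 100. The outlier
# compresses everyone else into the bottom of the range.
top = max(raw.values())
scaled = {u: 100 * v / top for u, v in raw.items()}   # Harvard ~55

# Method used since 2007: Z-scores, i.e. distance from the mean in
# standard deviations, which spreads the non-outliers back out.
mu, sigma = mean(raw.values()), pstdev(raw.values())
z = {u: (v - mu) / sigma for u, v in raw.items()}

# The overall score is then a weighted sum of the normalised
# indicator scores, using the weights given earlier.
WEIGHTS = {"academic_review": 0.40, "faculty_student": 0.20,
           "citations": 0.20, "recruiter_review": 0.10,
           "international": 0.10}

def overall(indicator_scores: dict) -> float:
    """Weighted aggregate of one university's normalised indicators."""
    return sum(w * indicator_scores[k] for k, w in WEIGHTS.items())
```

Running this shows the compression: under the old scaling Harvard sits at about 55, while its Z-score lies essentially at the sample mean, with Caltech roughly 1.8 standard deviations above it.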

Overall rankings

QS World University Rankings—Top 50
2014/15[18] 2013/14[19] 2012/13[20] 2011/12[21] 2010/11[22] Institution Region
1 1 1 3 5 Massachusetts Institute of Technology  United States
2 3 2 5 1 University of Cambridge  United Kingdom
2 5 6 6 7 Imperial College London  United Kingdom
4 2 3 2 2 Harvard University  United States
5 6 5 1 6 University of Oxford  United Kingdom
5 4 4 7 4 University College London  United Kingdom
7 7 15 11 13 Stanford University  United States
8 10 10 12 9 California Institute of Technology  United States
9 10 9 13 10 Princeton University  United States
10 8 7 4 3 Yale University  United States
11 9 8 8 8 University of Chicago  United States
12 12 13 18 18 Swiss Federal Institute of Technology in Zurich (ETH Zurich)   Switzerland
13 13 12 9 12 University of Pennsylvania  United States
14 14 11 10 11 Columbia University  United States
14 16 16 16 17 Johns Hopkins University  United States
16 19 26 27 21 King's College London  United Kingdom
17 17 21 20 22 University of Edinburgh  United Kingdom
17 19 29 35 32 Swiss Federal Institute of Technology in Lausanne (EPFL)   Switzerland
19 15 14 15 16 Cornell University  United States
20 17 19 23 29 University of Toronto  Canada
21 21 18 17 19 McGill University  Canada
22 24 25 28 31 National University of Singapore  Singapore
23 22 17 14 15 University of Michigan  United States
24 28 34 33 33 Ecole Normale Supérieure  France
25 27 24 26 20 Australian National University  Australia
25 23 20 19 14 Duke University  United States
27 25 22 21 28 University of California, Berkeley  United States
28 26 23 22 23 The University of Hong Kong  Hong Kong
29 30 28 30 27 University of Bristol  United Kingdom
30 33 32 29 30 The University of Manchester  United Kingdom
31 32 30 25 24 The University of Tokyo  Japan
31 35 37 42 50 Seoul National University  South Korea
33 31 36 31 38 The University of Melbourne  Australia
34 29 27 24 26 Northwestern University  United States
35 41 41 36 36 Ecole Polytechnique  France
36 35 35 32 25 Kyoto University  Japan
37 40 31 34 35 University of California, Los Angeles  United States
37 38 39 38 37 The University of Sydney  Australia
39 41 47 58 74 Nanyang Technological University  Singapore
40 34 33 40 40 The Hong Kong University of Science and Technology  Hong Kong
41 44 43 44 41 New York University  United States
41 37 38 41 48 University of Wisconsin-Madison  United States
43 49 45 51 44 University of British Columbia  Canada
43 43 46 48 43 The University of Queensland  Australia
45 39 40 37 42 The Chinese University of Hong Kong  Hong Kong
46 45 51 52 45 University of Copenhagen  Denmark
47 48 48 47 54 Tsinghua University  China
48 52 52 39 39 University of New South Wales  Australia
49 50 55 53 51 Ruprecht-Karls-Universität Heidelberg  Germany
50 58 62 63 49 University of Amsterdam  Netherlands
THE–QS World University Rankings, 2004
THE–QS World University Rankings, 2005
THE–QS World University Rankings, 2006
THE–QS World University Rankings, 2007
THE–QS World University Rankings, 2008
THE–QS World University Rankings, 2009

Rankings by faculty and subject

QS also ranks universities by five faculty areas: Arts & Humanities, Engineering & Technology, Life Sciences & Medicine, Natural Sciences, and Social Sciences & Management. These annual rankings are drawn up on the basis of academic opinion, recruiter opinion and citations.

QS University Faculty and Subject Rankings' categories[23][24]

  • Arts & Humanities: Philosophy; Modern Languages; Geography; History; Linguistics; English Language
  • Engineering & Technology: Computer Science & Information Systems; Chemical Engineering; Civil & Structural Engineering; Electrical & Electronic Engineering; Mechanical, Aeronautical & Manufacturing Engineering
  • Life Sciences & Medicine: Medicine; Biological Sciences; Psychology
  • Natural Sciences: Physics; Mathematics; Environmental Sciences; Marine Sciences; Chemistry; Materials Sciences
  • Social Sciences & Management: Operational Research; Sociology; Politics & International Studies; Economics; Accounting; Media Studies

QS Top 50 under 50

QS releases the QS Top 50 under 50 annually, ranking universities that have been established for no more than 50 years. This league table is based on their positions in the QS World University Rankings of the previous year.[25]

Regional rankings

QS Asian University Rankings

In 2009, QS launched the QS Asian University Rankings or QS University Rankings: Asia in partnership with The Chosun Ilbo newspaper in Korea to rank universities in Asia independently.

These rankings use some of the same criteria as the world rankings, but with changed weightings and new criteria; one addition is the number of incoming and outgoing exchange students. Accordingly, the QS World University Rankings and the QS Asian University Rankings released in the same academic year differ from each other.[1] For example, The University of Hong Kong, ranked 22nd and 23rd worldwide, was the highest-placed Asian institution in the QS World University Rankings of 2011 and 2012,[20][21] while The Hong Kong University of Science and Technology topped the QS Asian University Rankings in the same years.[26][27]

QS Asian University Rankings — Top 50
2014[28] 2013[29] 2012[26] 2011[27] 2010[30] 2009[31] Institution Region
1 2 2 3 3 10 National University of Singapore  Singapore
2 6 7 11 13 7 Korea Advanced Institute of Science and Technology  South Korea
3 2 3 2 1 1 The University of Hong Kong  Hong Kong
4 4 4 6 6 8 Seoul National University  South Korea
5 1 1 1 2 4 The Hong Kong University of Science and Technology  Hong Kong
6 7 5 5 4 2 The Chinese University of Hong Kong  Hong Kong
7 10 17 17 18 14 Nanyang Technological University  Singapore
8 5 6 13 12 10 Peking University  China
9 7 9 12 14 17 Pohang University of Science and Technology  South Korea
10 9 8 4 5 3 The University of Tokyo  Japan
11 12 12 15 15 18 City University of Hong Kong  Hong Kong
12 10 10 7 8 5 Kyoto University  Japan
13 15 11 8 7 6 Osaka University  Japan
14 14 15 16 16 15 Tsinghua University  China
15 13 13 9 11 9 Tokyo Institute of Technology  Japan
16 16 16 18 19 25 Yonsei University  South Korea
17 21 24 27 43 44 Sungkyunkwan University  South Korea
18 19 21 26 29 33 Korea University  South Korea
18 17 14 9 9 13 Tohoku University  Japan
20 18 18 14 10 12 Nagoya University  Japan
21 22 20 21 21 22 National Taiwan University  Taiwan
22 23 19 21 24 26 Fudan University  China
23 24 23 20 22 20 Hokkaido University  Japan
24 20 22 18 17 15 Kyushu University  Japan
25 26 27 24 25 24 University of Science and Technology of China  China
26 29 28 29 27 27 Nanjing University  China
27 25 26 30 30 38 The Hong Kong Polytechnic University  Hong Kong
28 27 29 33 34 29 Shanghai Jiao Tong University  China
29 36 33 44 49 46 Hanyang University  South Korea
29 30 49 52 71 74 National Chiao Tung University  Taiwan
31 28 28 27 32 32 Zhejiang University  China
32 33 35 39 42 39 Universiti Malaya  Malaysia
33 31 31 31 34 40 National Tsing Hua University  Taiwan
34 34 32 23 20 19 University of Tsukuba  Japan
35 32 30 24 23 20 Keio University  Japan
36 37 37 32 31 43 National Cheng Kung University  Taiwan
37 35 41 42 62 57 Kyung Hee University  South Korea
38 38 36 37 39 36 Indian Institute of Technology Delhi  India
39 40 40 45 48 42 Ewha Womans University  South Korea
40 42 38 34 28 30 Mahidol University  Thailand
41 39 34 38 36 30 Indian Institute of Technology Bombay  India
42 46 45 64 89 110 Beijing Normal University  China
43 41 39 35 26 23 Kobe University  Japan
44 44 42 46 39 37 Waseda University  Japan
45 43 48 49 45 73 Hong Kong Baptist University  Hong Kong
46 50 64 89 na na Taipei Medical University  Taiwan
47 47 44 41 38 28 Hiroshima University  Japan
48 48 43 47 44 35 Chulalongkorn University  Thailand
49 45 50 40 41 47 National Yang Ming University  Taiwan
50 62 67 73 128 121 Nankai University  China

QS Latin American University Rankings

The QS Latin American University Rankings or QS University Rankings: Latin America were launched in 2011. They use academic opinion (30%), employer opinion (20%), publications per faculty member, citations per paper, academic staff with a PhD, faculty/student ratio and web visibility (10 per cent each) as measures.[32]

QS BRICS University Rankings

QS collaborated with the Russian News to launch its third regional ranking, covering the BRICS countries (Brazil, Russia, India, China and South Africa), known as the QS University Rankings: BRICS. This ranking adopts eight indicators, derived from, but weighted differently to, those of the world rankings, to select the top 100 higher education institutions in these countries. The BRICS ranking takes only mainland China's universities into account, excluding institutions elsewhere in Greater China, such as those in Hong Kong and Taiwan.[33]

QS University Rankings: BRICS — Top 50[33]
2013 Institution Region
1 Tsinghua University  China
2 Peking University  China
3 Lomonosov Moscow State University  Russia
4 Fudan University  China
5 Nanjing University  China
6 University of Science and Technology of China  China
6 Shanghai Jiao Tong University  China
8 Universidade de São Paulo  Brazil
9 Zhejiang University  China
10 Universidade Estadual de Campinas  Brazil
11 University of Cape Town  South Africa
12 Beijing Normal University  China
13 Indian Institute of Technology Delhi  India
14 Saint-Petersburg State University  Russia
15 Indian Institute of Technology Bombay  India
16 Indian Institute of Technology Madras  India
17 Indian Institute of Technology Kanpur  India
18 Indian Institute of Technology Kharagpur  India
19 Universidade Federal do Rio de Janeiro  Brazil
20 Sun Yat-sen University  China
21 Xi'an Jiaotong University  China
22 Novosibirsk State University  Russia
23 Harbin Institute of Technology  China
23 Nankai University  China
25 Universidade Estadual Paulista "Júlio de Mesquita Filho"  Brazil
26 Wuhan University  China
27 Tongji University  China
28 Shanghai University  China
29 Universidade Federal de São Paulo  Brazil
30 Stellenbosch University  South Africa
31 University of The Witwatersrand  South Africa
32 Beihang University  China
33 Bauman Moscow State Technical University  Russia
34 Indian Institute of Technology Roorkee  India
35 Universidade Federal de Minas Gerais  Brazil
36 Pontificia Universidade Católica de São Paulo  Brazil
37 Moscow State Institute of International Relations  Russia
38 Universidade Federal do Rio Grande Do Sul  Brazil
39 Beijing Institute of Technology  China
40 Xiamen University  China
41 Pontificia Universidade Católica do Rio de Janeiro  Brazil
42 Renmin University of China  China
43 University of Pretoria  South Africa
43 Tianjin University  China
43 Beijing Jiaotong University  China
43 Universidade Federal de São Carlos  Brazil
47 Saint Petersburg State Polytechnical University  Russia
48 Universidade de Brasilia  Brazil
48 Huazhong University of Science and Technology  China
50 National Research University – Higher School of Economics  Russia

QS Stars

QS also offers universities a way of examining their own strengths and weaknesses in depth. Called QS Stars, this service is separate from the QS World University Rankings. It involves a detailed look at a range of functions that mark out a modern university. Universities can receive from one star to five, or Five Star Plus for the truly exceptional.

QS Stars ratings are derived from scores on eight criteria:

  • Research Quality
  • Teaching Quality
  • Graduate Employability
  • University Infrastructure
  • Internationalisation
  • Innovation and knowledge transfer
  • Third mission activity, measuring areas of social and civic engagement
  • Special criteria for specific subjects

Stars is an evaluation system, not a ranking. About 100 institutions had opted for the Stars evaluation as of early 2013. In 2012, fees to participate in this program were $9850 for the initial audit and an annual license fee of $6850.[34]


Commentary

Several universities in the UK and the Asia-Pacific region have commented positively on the rankings. The Vice-Chancellor of New Zealand's Massey University, Professor Judith Kinnear, said that the Times Higher Education-QS ranking is a "wonderful external acknowledgement of several University attributes, including the quality of its research, research training, teaching and employability." She said the rankings are a true measure of a university's ability to fly high internationally: "The Times Higher Education ranking provides a rather more sophisticated, robust and well rounded measure of international and national ranking than either New Zealand's Performance Based Research Fund (PBRF) measure or the Shanghai rankings."[35] In September 2012 the British newspaper The Independent described the QS World University Rankings as being "widely recognised throughout higher education as the most trusted international tables".[36]

Martin Ince,[37] chair of the Advisory Board for the Rankings, points out that their volatility has been reduced since 2007 by the introduction of the Z-score calculation method, and that over time the quality of QS's data gathering has improved to reduce anomalies. In addition, the academic and employer reviews are now so large that even modestly ranked universities receive a statistically valid number of votes. QS has published extensive data[38] on who the respondents are, where they are, and the subjects and industries to which the academics and employers respectively belong.

General criticisms

Many are concerned with the use or misuse of survey data.

Since the split from Times Higher Education, further concerns about the methodology QS uses for its rankings have been raised by several experts. Simon Marginson, professor of higher education at the University of Melbourne and a member of the THE editorial board, wrote in the article "Improving Latin American universities' global ranking" for University World News on 10 June 2012: "I will not discuss the QS ranking because the methodology is not sufficiently robust to provide data valid as social science."[39]

In an article for the New Statesman entitled "The QS World University Rankings are a load of old baloney", David Blanchflower, a leading labour economist, said: "This ranking is complete rubbish and nobody should place any credence in it. The results are based on an entirely flawed methodology that underweights the quality of research and overweights fluff... The QS is a flawed index and should be ignored."[40]

In an article titled The Globalisation of College and University Rankings and appearing in the January/February 2012 issue of Change magazine, Philip Altbach, professor of higher education at Boston College and also a member of the THE editorial board, said: “The QS World University Rankings are the most problematical. From the beginning, the QS has relied on reputational indicators for half of its analysis … it probably accounts for the significant variability in the QS rankings over the years. In addition, QS queries employers, introducing even more variability and unreliability into the mix. Whether the QS rankings should be taken seriously by the higher education community is questionable."[41]

The QS World University Rankings have been criticised by many for placing too much emphasis on peer review, which receives 40 percent of the overall score. Some people have expressed concern about the manner in which the peer review has been carried out.[42] In a report,[43] Peter Wills from the University of Auckland, New Zealand wrote of the Times Higher Education-QS World University Rankings:

But we note also that this survey establishes its rankings by appealing to university staff, even offering financial enticements to participate (see Appendix II). Staff are likely to feel it is in their greatest interest to rank their own institution more highly than others. This means the results of the survey and any apparent change in ranking are highly questionable, and that a high ranking has no real intrinsic value in any case. We are vehemently opposed to the evaluation of the University according to the outcome of such PR competitions.

QS points out that no survey participant, academic or employer, has been offered a financial incentive to respond, and that academics cannot vote for their own institution.

Although THES-QS introduced several changes in methodology in 2007 that were aimed at addressing these criticisms,[44] the ranking has continued to attract criticism. In an article[45] in the peer-reviewed journal BMC Medicine, several scientists from the US and Greece pointed out:

If properly performed, most scientists would consider peer review to have very good construct validity; many may even consider it the gold standard for appraising excellence. However, even peers need some standardized input data to peer review. The Times simply asks each expert to list the 30 universities they regard as top institutions of their area without offering input data on any performance indicators. Research products may occasionally be more visible to outsiders, but it is unlikely that any expert possesses a global view of the inner workings of teaching at institutions worldwide. Moreover, the expert selection process of The Times is entirely unclear. The survey response rate among the selected experts was only <1% in 2006 (1,600 of 190,000 contacted). In the absence of any guarantee for protection from selection biases, measurement validity can be very problematic.

Alex Usher, vice president of Higher Education Strategy Associates in Canada, commented:

Most people in the rankings business think that the main problem with The Times is the opaque way it constructs its sample for its reputational rankings - a not-unimportant question given that reputation makes up 50% of the sample. Moreover, this year's switch from using raw reputation scores to using normalized Z-scores has really shaken things up at the top-end of the rankings by reducing the advantage held by really top universities - University of British Columbia (UBC) for instance, is now functionally equivalent to Harvard in the Peer Review score, which, no disrespect to UBC, is ludicrous. I'll be honest and say that at the moment the THES Rankings are an inferior product to the Shanghai Jiao Tong’s Academic Ranking of World Universities.

Academicians have also been critical of the use of the citation database, arguing that it undervalues institutions which excel in the social sciences. Ian Diamond, former chief executive of the Economic and Social Research Council and now vice-chancellor of the University of Aberdeen and a member of the THE editorial board, wrote to Times Higher Education in 2007, saying:[46]

The use of a citation database must have an impact because such databases do not have as wide a cover of the social sciences (or arts and humanities) as the natural sciences. Hence the low position of the London School of Economics, caused primarily by its citations score, is a result not of the output of an outstanding institution but the database and the fact that the LSE does not have the counterweight of a large natural science base.

The most recent criticism of the old system came from Fred L. Bookstein, Horst Seidler, Martin Fieder and Georg Winckler, writing in the journal Scientometrics on the unreliability of QS's methods:

Several individual indicators from the Times Higher Education Survey (THES) data base (the overall score, the reported staff-to-student ratio, and the peer ratings) demonstrate unacceptably high fluctuation from year to year. The inappropriateness of the summary tabulations for assessing the majority of the "top 200" universities would be apparent purely for reason of this obvious statistical instability, regardless of other grounds of criticism. There are far too many anomalies in the change scores of the various indices for them to be of use in the course of university management.[47]

Subject rankings reliability

The QS subject rankings have been dismissed as unreliable by some critics, including most notably Brian Leiter, who points out that programmes which are known to be high quality, and which rank highly in the Blackwell rankings (e.g., the University of Pittsburgh) fare poorly in the QS ranking for reasons that are not at all clear.[48][49]

In other areas, QS has highly ranked programmes that do not exist,[50] as in Geography, where 5 of the top 10 institutions did not actually have graduate programmes in geography. In Linguistics, the QS rankings are entirely out of step with the most recent NRC rankings; the NRC ranks the doctoral programmes of the University of Massachusetts Amherst and the University of Maryland at College Park among the very best in the U.S.A. (tied for #3 in S-Rank), while QS ranks them 29th and 49th in the world, respectively.[51]

Notes and references

  1. ^ a b "Asian University Rankings - QS Asian University Rankings vs. QS World University Rankings™". The methodology differs somewhat from that used for the QS World University Rankings... 
  2. ^ Ariel Zirulnick. "New world university ranking puts Harvard back on top".  
  3. ^ "We're fighting above our weight when it comes to uni rankings".  
  4. ^ Indira Samarasekera and Carl Amrhein. "Top schools don't always get top marks".  
  5. ^ Lambert Review of Business-University Collaboration
  6. ^ Princeton University Press, 2010
  7. ^ Martin Ince Communications
  8. ^ Mroz, Ann. "Leader: Only the best for the best". Times Higher Education. Retrieved 2010-09-16. 
  9. ^ Baty, Phil (2010-09-10). "Views: Ranking Confession". Inside Higher Ed. Retrieved 2010-09-16. 
  10. ^ Labi, Aisha (2010-09-15). "Times Higher Education Releases New Rankings, but Will They Appease Skeptics?". The Chronicle of Higher Education (London, UK). Retrieved 2010-09-16. 
  11. ^ a b "2011 Academic Survey Responses". Retrieved 12 September 2013. 
  12. ^ QS Intelligence Unit | Faculty Student Ratio. Retrieved on 2013-08-12.
  13. ^ a b QS Intelligence Unit | Citations per Faculty. Retrieved on 2013-08-12.
  14. ^ University Ranking Watch
  15. ^ "Global university rankings and their impact". European University Association. Retrieved 3 September 2012.
  16. ^ QS Intelligence Unit | Employer Reputation. Retrieved on 2013-08-12.
  17. ^ QS Intelligence Unit | International Indicators. Retrieved on 2013-08-12.
  18. ^ "QS World University Rankings (2014)". 
  19. ^ "QS World University Rankings (2013)". 
  20. ^ a b "QS World University Rankings (2012)". 
  21. ^ a b "QS World University Rankings (2011)". 
  22. ^ "QS World University Rankings (2010)". 
  23. ^ "A new approach to faculty areas". Quacquarelli Symonds. Retrieved 12 August 2014. 
  24. ^ "QS University Subject Rankings". Quacquarelli Symonds. Retrieved 12 August 2014. 
  25. ^ "QS Top 50 under 50". Quacquarelli Symonds. Retrieved 2013-07-07. 
  26. ^ a b "QS Asian University Rankings (2012)". 
  27. ^ a b "QS Asian University Rankings (2011)". 
  28. ^ "QS Asian University Rankings (2014)". 
  29. ^ "QS Asian University Rankings (2013)". 
  30. ^ "QS Asian University Rankings (2010)". 
  31. ^ "QS Asian University Rankings (2009)". 
  32. ^ "Methodology (QS University Rankings – Latin America)". Quacquarelli Symonds. Retrieved 12 August 2014. 
  33. ^ a b "QS University Rankings: BRICS". Quacquarelli Symonds. 2013-12-17. Retrieved 2013-12-17. 
  34. ^ "Ratings at a Price for Smaller Universities". The New York Times. 30 December 2012. Retrieved 10 September 2013. 
  35. ^ Flying high internationally
  36. ^ "Cambridge loses top spot to Massachusetts Institute of Technology". The Independent. 11 September 2012. Retrieved 11 September 2012. 
  37. ^ Martin Ince Communications Limited
  38. ^ QS World University Rankings | QS Intelligence Unit
  39. ^ Improving Latin American universities' global ranking - University World News
  40. ^ The QS World University Rankings are a load of old baloney
  41. ^ Change Magazine - January-February 2012
  42. ^ Holmes, Richard (2006-09-05). "So That's how They Did It". Retrieved 2010-09-16. 
  43. ^ Response to Review of Strategic Plan by Peter Wills
  44. ^ Sowter, Ben (1 November 2007). "THES – QS World University Rankings 2007 – Basic explanation of key enhancements in methodology for 2007".
  45. ^ "" (PDF). Retrieved 2010-09-16. 
  46. ^ "Social sciences lose 1". 2007-11-16. Retrieved 2010-09-16. 
  47. ^ "Scientometrics, Volume 85, Number 1". SpringerLink. Retrieved 2010-09-16. 
  48. ^ Leiter Reports: A Philosophy Blog: Guardian and "QS Rankings" Definitively Prove the Existence of the "Halo Effect". (2011-06-05). Retrieved on 2013-08-12.
  49. ^ Leiter Reports: A Philosophy Blog: The QS Subject Rankings are Complete Garbage. (2012-07-30). Retrieved on 2013-08-12.
  50. ^ Sedghi, Ami (2011-06-03). "The world's top 100 universities ranked for arts and humanities disciplines". Retrieved 2013-08-12.
  51. ^ NRC Rankings Overview: Linguistics - Faculty - The Chronicle of Higher Education. (2010-09-30). Retrieved on 2013-08-12.

External links

  • Quacquarelli Symonds official website
  • Full rankings 2011/12
  • QS Intelligence Unit Blog — blog on rankings and higher education from the team that compiles the QS World University Rankings
  • Interactive maps comparing the QS World University Rankings with the Academic Ranking of World Universities and Times Higher Education rankings