Mechanical processing of data leads to laughable MHRD Rankings
Mahesh Sarma, 14 Apr 2016

On 4 April 2016, the Hon'ble Minister of Human Resource Development (MHRD) announced, with much fanfare, the rankings of the top educational institutions in the country. Rankings were announced for Universities, Engineering, Management and Pharmacy. For a nation where credible information on institutions is hard to come by, a ranking exercise by the state is a welcome move. But the outcome has been quite miserable.

 

A one-of-a-kind initiative, the National Institutional Ranking Framework (NIRF) is flawed in design and method, and has thus produced questionable output.

 

Here is a set of issues for the ranking agency to consider:

 

1. In the final ranking sheet, across domains, there are a few outstanding institutions, about 20% good ones, 30-35% average ones, and downright odd entries making up the rest. For instance, in the case of B-Schools, see the article by Maheshwer Peri (https://www.outlookindia.com/website/story/open-letter-to-smriti-irani/296812), which illustrates this point. The NIRF needs to answer why it ranked every institute that came its way.

 

2. Take the university listing. We have the National Assessment and Accreditation Council (NAAC), which also creates a comprehensive score, after flying in a team of experts who spend days looking at data and engaging with faculty. And we have NIRF, with its faceless babus and mechanical data processing. Both NIRF and NAAC finally produce a score. Among the top 100 universities listed by NIRF are ten schools whose NAAC score is below 3; five of them score below 2.7.

| Name | NIRF Score | NIRF Rank | NAAC Score |
|---|---|---|---|
| Assam University | 53.13 | 77 | 2.92 |
| Indian Institute of Space Science and Technology | 78.82 | 8 | 2.87 |
| Gujarat University | 54.01 | 73 | 2.85 |
| Visva Bharati | 76.11 | 11 | 2.82 |
| Karunya Institute of Technology and Sciences | 60.85 | 48 | 2.70 |
| Jaypee University of Information Technology | 64.21 | 37 | 2.63 |
| Tripura University | 49.89 | 88 | 2.63 |
| Sikkim University | 57.08 | 61 | 2.60 |
| Yogi Vemana University | 49.11 | 92 | 2.54 |
| Sathyabama Institute of Science and Technology | 62.46 | 42 | 2.50 |

And here are some of the top universities that do not find a mention in the NIRF list despite a good NAAC score.

| Name of the institute | NAAC Score |
|---|---|
| Jadavpur University, Kolkata | 3.68 |
| Indira Gandhi Institute of Development Research, Mumbai | 3.64 |
| Dr. D.Y. Patil Vidyapeeth, Pune | 3.62 |
| Symbiosis International University, Pune | 3.58 |
| Shanmugha Arts, Science, Technology & Research Academy (SASTRA) University, Thanjavur | 3.54 |
| Indian Institute of Foreign Trade, New Delhi | 3.53 |
| Sumandeep Vidyapeeth, Vadodara | 3.53 |
| University of Mysore, Mysore | 3.47 |
| Anna University, Chennai | 3.46 |
| The ICFAI Foundation for Higher Education, Hyderabad (Second Cycle) | 3.43 |

3. Assuming these schools did not participate, why did NIRF end up ranking anybody and everybody who did? One should be aware that, in terms of scoring, both NAAC and NIRF allocate 30% each to teaching and research. It therefore stands to reason that there should be at least a decent correlation between the two systems, even after accounting for the remaining 40%. Can this wide variance at least be explained? The sketch below shows how easy the sanity check would have been.
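As a back-of-the-envelope illustration, here is a minimal sketch that pairs the NIRF and NAAC scores from the table above and computes a Pearson correlation. It is illustrative only: a proper check would cover all 100 ranked universities, not just the ten low-NAAC entries listed here.

```python
# Rough sanity check: do NIRF and NAAC scores move together?
# Data pairs (NIRF score, NAAC score) are taken from the table above.
from math import sqrt

scores = [
    (53.13, 2.92),  # Assam University
    (78.82, 2.87),  # Indian Institute of Space Science and Technology
    (54.01, 2.85),  # Gujarat University
    (76.11, 2.82),  # Visva Bharati
    (60.85, 2.70),  # Karunya Institute of Technology and Sciences
    (64.21, 2.63),  # Jaypee University of Information Technology
    (49.89, 2.63),  # Tripura University
    (57.08, 2.60),  # Sikkim University
    (49.11, 2.54),  # Yogi Vemana University
    (62.46, 2.50),  # Sathyabama Institute of Science and Technology
]

def pearson(pairs):
    """Pearson correlation coefficient of a list of (x, y) pairs."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    sx = sqrt(sum((x - mx) ** 2 for x, _ in pairs))
    sy = sqrt(sum((y - my) ** 2 for _, y in pairs))
    return cov / (sx * sy)

print(f"Pearson r between NIRF and NAAC scores: {pearson(scores):.2f}")
```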

 

4. Why are student preferences missing from this national-level ranking? Cut-offs determine a student's future in most admissions, and they are a telling indicator of an institution's standing in the minds of students. SRCC B.Com (Hons.) admission demands the 100th percentile, as does admission to the College of Engineering, Anna University, while an average Delhi University college will admit you even at the 65th percentile. One would have assumed that the state, with all its powers, could have used this occasion to give students an indicator of an institution's standing based on its best and worst cut-offs.

 

5. Most recruiters use the cut-off as a proxy for the quality of a cohort at a school and even take recruiting decisions based on it. This is primarily why a new IIM, housed in a rented two-storey building or on an engineering college campus, still gets recruiters: they are assured of a 90th-percentile cohort there. The brand does the rest of the magic. Yet for some inexplicable reason this most crucial parameter is left out of the reckoning. The result is that odd institutions that cannot even fill 15% of their seats find a place in the NATIONAL RANK SHEET.

 

6. Let us take the first component of the NIRF methodology, "Teaching and Learning", which accounts for 30% of the total score. In addition to inane issues like investment in the library, etc., the core factors considered are the student/faculty ratio, the PhD/non-PhD ratio and the experience of the faculty. And how does NIRF count experience? As the age of the teacher minus 15 years, with 15 years taken as the ideal experience. The net result is that any university stacked with senior professors will get a full score on that count! A sketch of this rule follows.
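Here is a minimal sketch of the experience rule as described above. The article gives only the age-minus-15 proxy and the 15-year ideal; the clamping and 0-1 normalization below are assumptions added to make the example runnable.

```python
# Faculty "experience" as the article describes NIRF counting it:
# experience = age - 15, with 15 years treated as the ideal.
# The clamp-and-normalize step is our assumption for illustration.

IDEAL_EXPERIENCE = 15  # years, per the article

def experience_score(ages):
    """Average experience score for a faculty roster, on a 0-1 scale."""
    def per_teacher(age):
        experience = max(age - 15, 0)  # the article's proxy
        return min(experience, IDEAL_EXPERIENCE) / IDEAL_EXPERIENCE
    return sum(per_teacher(a) for a in ages) / len(ages)

# Under this rule anyone aged 30 or above already hits the cap, so a
# roster of ageing professors maxes out the metric automatically:
print(experience_score([60, 58, 62]))  # -> 1.0
print(experience_score([25, 27, 29]))  # -> 0.8
```

Note that the proxy says nothing about teaching quality; it only rewards age.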

 

7. What is the result of this amazing methodology? Among the top 15 universities, four are agriculture/veterinary universities (no offence meant to them). And in management, IMI Kolkata ranks slightly above IMI Delhi, the parent school. IMI must withdraw from the ranking forthwith!

 

8. And five of the 15 are very specialized small schools, either Institutions of National Importance (INIs) or Deemed Universities. Both kinds will naturally have good student/faculty and PhD/non-PhD ratios.

 

9. The point is that, as a ranking agency, NIRF must move beyond investments and the square footage of buildings; it is reminiscent of the licence-quota raj. Neither teaching innovation finds a mention here, nor has curriculum development been ascribed any score. You, as the government, have the ability and the power to go beyond these numbers, but you didn't. You are the government. Get better and smarter data. Or simply move out of the ranking exercise.

 

10. Let us get to the next important rating component: research output, a fascinating tool used the world over to rate good schools. But we are a great nation, so what do we do? Instead of one index we use three: Scopus (I will ignore the typo), Web of Science and Google Scholar. Never mind that journals are duplicated across them and that it would be ridiculous to average them out; we did that too, with complicated weights that vary for citations and publications, and vary again for each domain. Who can argue with the govt.? So look for journals indexed in Scopus, Web of Science and Google Scholar: one paper will get you three points (see the sketch below). Ache din sabke aayenge!
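A minimal sketch of the triple counting being described. The paper IDs and index memberships below are made up for illustration; the point is only that counting once per index inflates the tally.

```python
# Illustrative only: counting a paper once per index versus once overall.
# Paper IDs and index memberships are hypothetical.
papers = {
    "paper-1": {"scopus", "web_of_science", "google_scholar"},
    "paper-2": {"scopus", "google_scholar"},
    "paper-3": {"google_scholar"},
}

# Per-index counting, the way the article says the aggregation works:
per_index_count = sum(len(indexes) for indexes in papers.values())

# Deduplicated counting, the sane alternative:
unique_count = len(papers)

print(per_index_count)  # 6 "publications"...
print(unique_count)     # ...from 3 actual papers
```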

 

11. This has resulted in odd names like Goa University, North Maharashtra University and Gauhati University scoring above 75% in research. They simply do not publish enough to be even among the top 50 institutions. It gets much worse with B-Schools: beyond the top few, most have publication counts in single digits yet ended up with research scores in two digits. NIRF must either explain why it awarded such ridiculous scores or apologize to the nation!

 

12. Let us now move on to the next parameter: outcomes. Schools that place 10% of their total strength, at average salaries of Rs 20,000 to 25,000 per month, figure in the top 50 listings. But then it is difficult to blame NIRF: outcomes are notoriously hard to measure. So, like the drunkard who searches for the lost needle where there is light rather than where he lost it, NIRF decided, "We will rank based on what we can measure." For universities, the measure is the pass percentage (50%) and the pass percentage in public examinations (50%), as sketched below. So what if most private universities and half the public universities in the country are out of the reckoning on this parameter?
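A minimal sketch of the outcomes score as described: an even split between overall pass percentage and pass percentage in public examinations. The 50/50 weights are from the article; the 0-100 scaling is an assumption.

```python
# Outcomes score per the article: 50% pass percentage, 50% pass
# percentage in public examinations. Nothing about placements,
# salaries or further study enters the number.
def outcomes_score(pass_pct, public_exam_pass_pct):
    return 0.5 * pass_pct + 0.5 * public_exam_pass_pct

print(outcomes_score(92.0, 40.0))  # -> 66.0
```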

 

13. And we are not even beginning to talk about rating affiliating universities like Delhi University, that too with colleges as diverse as St. Stephen's and ARSD. For Delhi University, will NIRF consider constituent schools, affiliated colleges, autonomous colleges, or all of them?

 

14. Or take VTU or Anna University, each with some 300-500 affiliated engineering colleges. How will their outcomes be measured? Can we at least start a discussion, NIRF?

 

15. Now comes the best part: peer perception. This is the tool most ranking agencies use to push or pull a difficult customer. The database is opaque (composition, seniority of pollsters, geography, educational qualifications); nothing is available in the public domain, even for globally respected rankings like THE or QS. That is why ARWU, which is completely objective, sometimes throws up very different names vis-a-vis the other two.

 

16. Now NIRF attempts something different: it uses peer perception. For B-Schools it also provides data on public perception, but not for the others. So much for data transparency. But interestingly, 36 of the top 100 universities get a score of 0 in peer perception. Should they be in the top 100 listing at all?

 

17. But the scoring gets even more complicated. Universities with a 0 peer score in the university ranking get 0 marks for it. Yet when it comes to B-Schools, 34 of the top 50 have a 0 peer score, but all of them get marks between 24 and 46. Is it that most schools in the NIRF listings have no standing among peers, and that was too much for the rankers to stomach, so they ended up handing some score to all of them?

 

18. Lastly, if quality improvement is NIRF's goal, why create this artificial distinction between teaching-plus-research schools (Type A) and teaching-only schools (Type B)? This is all the more laughable since all that differs between the two categories is the weightage allocated to research in the scoring (see the sketch below). Whom are you kidding, and why?
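A minimal sketch of what the Type A/Type B split amounts to, as described above: the same scoring machinery with the research weight changed. The specific weight values below are assumptions for illustration, not NIRF's published numbers; the article only confirms the 30% teaching and research weights.

```python
# Hypothetical weights: Type A carries a research weight, Type B
# redistributes it. Only the 30% teaching/research figures are from
# the article; the remaining weights are made up for illustration.
TYPE_A = {"teaching": 0.30, "research": 0.30, "outcomes": 0.25, "perception": 0.15}
TYPE_B = {"teaching": 0.45, "research": 0.00, "outcomes": 0.40, "perception": 0.15}

def total_score(components, weights):
    """Weighted sum of component scores (each on a 0-100 scale)."""
    return sum(weights[k] * components[k] for k in weights)

# The same institution, scored under both "types":
components = {"teaching": 70, "research": 20, "outcomes": 60, "perception": 30}
print(total_score(components, TYPE_A))  # Type A score
print(total_score(components, TYPE_B))  # Type B: research simply drops out
```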

 

This is a sample list of issues; as we dig further into the data, we could find more. I do sympathize with NIRF. Most of the issues I have identified are ones that any serious ranking organization, including Careers360, would face. But we make the effort to keep the list representative by going beyond mechanical processing of the data supplied. We identify rogues and remove them. We apologize when there is an error. We work closely with stakeholders. And we definitely do not go by perception. All that takes a bit of humility and the willingness to look beyond the IITs and IIMs. The country needs it. Will the real NIRF stand up and be counted?

And in the meantime, please withdraw the current list. It is a disgrace.

