
NIRF ranking does not give full picture of higher education in India

S S Mantha and Ashok Thakur write: It is based on limited parameters and seems to be committing the same sin that the global ranking systems were once accused of — a one-size-fits-all approach.


The National Institutional Ranking Framework (NIRF) rankings of universities and colleges for the year 2021 were released on September 9 by the Ministry of Education (MoE). In a big upset, the Indian Institute of Science (IISc) was dislodged by IIT Madras as the top institute in the country. Apart from this, the rest seemed business as usual. How seriously should we take this annual educational tournament, and how does it compare with the global systems?

The world over, ranking educational institutes is a matter of debate and research. There are at least 20 global ranking agencies that measure quality on various parameters. Australia has the Research Performance Index, which measures the performance of its universities. The Centre for Science and Technology Studies at Leiden University maintains European and worldwide rankings of the top 500 universities based on the number and impact of Web of Science-indexed publications per year. The QS World University Rankings have been published annually since 2004. In 2009, QS even launched the QS Asian University Rankings in partnership with the Chosun Ilbo newspaper in South Korea. Times Higher Education (THE), a British publication, together with Thomson Reuters, has been providing its own set of world university rankings since 2011.

Interestingly, there is also a “Ranking of Rankings”, UniRanks, launched in 2017. It aggregates the results of five global rankings, combining them to form a single rank. It uses THE World University Ranking (22.5 per cent), QS World University Ranking (22.5 per cent), US News Best Global University (22.5 per cent), Academic Ranking of World Universities (22.5 per cent), and Reuters World Top 100 Innovative Universities (10 per cent).
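As an aside on the mechanics, the aggregation UniRanks describes is a plain weighted sum. Here is a minimal sketch in Python, assuming each constituent ranking has already been converted into a normalised score out of 100 (the example scores are made up):

    # Weights UniRanks assigns to its five constituent rankings (per the figures above).
    WEIGHTS = {
        "THE": 0.225,
        "QS": 0.225,
        "US News": 0.225,
        "ARWU": 0.225,
        "Reuters Innovative": 0.10,
    }

    def aggregate(scores):
        """Combine normalised 0-100 scores into a single weighted score."""
        return sum(WEIGHTS[name] * score for name, score in scores.items())

    # Hypothetical normalised scores for one university.
    print(aggregate({"THE": 82.0, "QS": 78.5, "US News": 80.0,
                     "ARWU": 75.0, "Reuters Innovative": 60.0}))  # 76.9875

Institutions are then ranked by sorting on this combined score.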

The NIRF ranking is based on five parameters — teaching-learning and resources, research and professional practice, graduation outcomes, outreach and inclusivity, and perception of the institution. The overall score is computed based on the weightage allotted to each parameter and its sub-parameters. Some data is provided by the institutions themselves and the rest is sourced from third-party sites.
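The same arithmetic extends one level down: each parameter score is itself a weighted sum over its sub-parameter scores. A sketch of such a two-level rollup, with illustrative weights rather than NIRF's published ones:

    # Two-level rollup: each parameter carries a weight and its own
    # sub-parameter weights. All values here are illustrative, not NIRF's.
    PARAMETERS = {
        "teaching-learning and resources": (0.30, {"faculty-student ratio": 0.5,
                                                   "facilities": 0.5}),
        "research and professional practice": (0.30, {"publications": 0.6,
                                                      "patents": 0.4}),
        "graduation outcomes": (0.20, {"pass rate": 1.0}),
        "outreach and inclusivity": (0.10, {"diversity": 1.0}),
        "perception": (0.10, {"peer survey": 1.0}),
    }

    def overall(data):
        """data maps parameter -> sub-parameter -> 0-100 score."""
        return sum(weight * sum(sub_w[s] * data[param][s] for s in sub_w)
                   for param, (weight, sub_w) in PARAMETERS.items())

Once such weights are fixed and public, reported data can be tuned to maximise the score, which is why ranking becomes a numbers game.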

The quality of an institution is a function of several inputs, and the above indicators alone may not be sufficient. How can we include the skills that an institution or university imparts to its students as one of the important ingredients? Should the financial health and size of the institution not be a criterion? Should the financial benefits that accrue to the stakeholders, especially the students, not be linked to the ranking? Ideally, an objective function should be defined for an institution, with the desired attributes as variables and a weightage apportioned to each attribute according to its importance in the overall value proposition.

Whereas IISc, with 464 faculty members for 4,000 students, has a faculty ratio of 1:8.6 and receives about Rs 350 crore in central grants, BHU, with 2,000 teachers for 32,000 students, has a ratio of 1:16 and receives a grant of about Rs 200 crore. In the Union Budget 2021, the government allocated Rs 7,686 crore to the IITs, whereas the total outlay for all Central Universities was Rs 7,643.26 crore. Some departments in the IITs have even better faculty ratios since they are not bound by the cadre rules applicable to state universities. While state university budgets are ridiculously low, these universities are competing on the same quality parameters and are expected to outperform the better-endowed institutions. Is it not time to also check the return on investment (ROI), especially when several of our students from elite institutions, educated on public money, don't even serve within the country? Surely, ROI is an important parameter missed out in the NIRF rankings.

The diversity of the Indian education system is vast. There are new as well as old institutions offering degrees, diplomas and certifications. There are also technology versus social science institutions, multi-disciplinary versus single-discipline, private versus public, research-based, innovation-based, language-based and even special-purpose institutions and universities. The boundary conditions in which they operate are very different. NIRF seems to be committing the same sin that the global ranking systems were once accused of — a one-size-fits-all approach.

Another glaring oversight is the disconnect between ranking and accreditation. Several universities have earned an A grade from the National Assessment and Accreditation Council (NAAC) but figure poorly in the ranking system. NIRF must take into consideration the NAAC and National Board of Accreditation (NBA) scores. Though the government has no role in the business of either ranking or accreditation, the least one can expect is that the left hand, NIRF, knows what the right hand, the NAAC and the NBA, is doing.

Ranking is a numbers game: after two iterations, institutes become adept at supplying the data that maximise their scores. Accreditation, on the other hand, is a peer-reviewed process and is often accused of subjectivity. Though both are imperfect, accreditation and Quality Assurance (QA) will probably become the de facto standard in the future, as in the US, since they allow stakeholders to sue universities that renege on delivering what they claim. A Bill to introduce such accountability was tabled in 2011 but never saw the light of day.

Two factors that are absent here, and that differentiate us from the global ranking systems, are international faculty and students, and research that connects with industry. International faculty and students will come only if they see a value proposition in our institutions, itself an indicator of quality. The industry connect will happen only when research translates into improved or new processes and products. Patents translated into products have value, not patents that are merely filed. To make this happen, NIRF has to have top experts, not only from within the country but also from outside, on its core committees.

Our institutions have fallen short of global expectations on both these counts from the beginning. Though NIRF arrived post-2014, partly as a consequence, with parameters to assuage our ruffled egos, we must be pragmatic and realise that quality cannot be measured in a silo. Having let go of being compared on a global scale, our universities can choose to be rank insiders or rank outsiders.

This column first appeared in the print edition on October 6, 2021 under the title ‘How not to test quality’.
Mantha is former chairman, AICTE, and Thakur is former Secretary, MHRD, GoI

Source: Indian Express, 6/10/21