
Showing posts with label NIRF.

Friday, June 09, 2023

National Institutional Ranking Framework (NIRF) Rankings 2023

The National Institutional Ranking Framework (NIRF) Rankings 2023 were recently released by the Minister of State for Education and External Affairs, Rajkumar Ranjan Singh. These rankings serve as a significant benchmark for assessing the quality and performance of educational institutions across the country.

An Overview of NIRF Rankings 2023

The NIRF rankings encompass four major categories: Overall, Colleges, Universities, and Research Institutions. The rankings aim to provide valuable insights into the progress and performance of institutions in terms of academic excellence, research output, and overall institutional quality. These rankings are available on the official website of NIRF at nirfindia.org.

Expanding Categories and Dimensions

In its eighth edition, the NIRF rankings have expanded from four to twelve categories, including eight subject-specific rankings. This expansion allows for a more comprehensive assessment of institutions across various domains. The subject domains now include Engineering, Management, Pharmacy, Law, Medical, Architecture and Planning, Dental, and a new addition—Agriculture and Allied Sectors.

Top Performers in the Overall Category

In the overall category, the Indian Institute of Technology (IIT) Madras secured the top position, maintaining its success from the previous year, while the Indian Institute of Science (IISc), Bengaluru, was ranked second overall and also emerged as the leading university, reaffirming its commitment to academic excellence.

Excellence in Universities

In the university rankings, IISc Bengaluru retained its top position, showcasing its consistent commitment to research and academic prowess. Jawaharlal Nehru University (JNU) and Jamia Millia Islamia (JMI) secured the second and third spots, respectively, highlighting their contributions to the academic landscape of the country.

Leaders in Engineering

When it comes to engineering institutes, IIT Madras once again emerged as the top performer. IIT Delhi, IIT Bombay, IIT Kanpur, and IIT Roorkee followed closely, showcasing their exceptional contributions to engineering education and research.

Noteworthy Management Institutions

In the field of management, the Indian Institute of Management (IIM) Ahmedabad secured the top spot, demonstrating its excellence in producing business leaders. IIM Bangalore, IIM Kozhikode, IIM Calcutta, and IIT Delhi were also recognized for their outstanding contributions to management education.

Recognizing Achievements in Other Domains

The NIRF rankings also acknowledge the top performers in other domains such as Pharmacy, Colleges, Medical, Research Institutions, Innovation, Law, Architecture, Dental, and Agriculture and Allied Sectors. These rankings shed light on the institutions that have excelled in their respective fields and contribute significantly to the growth of those domains.

The Significance of NIRF Rankings

The NIRF rankings play a crucial role in evaluating and improving the quality of higher education in India. By providing a comprehensive assessment and comparison of institutions, the rankings help students make informed decisions about their academic pursuits. The NIRF rankings align with the government’s efforts to enhance the quality and accessibility of education across the nation.

With the expansion of categories and a growing number of participating institutions, the NIRF rankings continue to evolve and become more inclusive. They serve as a reliable guide for students, parents, and educational stakeholders, enabling them to identify and appreciate the best institutions in India.

Monday, July 18, 2022

NIRF Rankings 2022 – Highlights

The NIRF Rankings 2022 were released on July 15, 2022, by Union Education Minister Dharmendra Pradhan.


Key Points

  • In 2022, rankings were announced in 11 categories: Engineering, Colleges, Universities, Management, Overall, Medical, Law, Architecture, Research, Dental, and Pharmacy.
  • The number of institutions taking part in the NIRF rankings has been increasing over the years.
  • In the 2021 rankings, about 6,000 institutions took part across eleven categories. The Indian Institute of Management (IIM) Ahmedabad was ranked first in the management category that year, followed by IIM Bangalore and IIM Calcutta.

NIRF Rankings 2022 – Highlights

  • In the engineering institutes category, IIT Madras was ranked first.
  • In the medical college category, AIIMS New Delhi was ranked the best.
  • Saveetha Institute of Medical and Technical Sciences, Chennai, was ranked best in the dental college category.
  • Miranda House was ranked the best college, followed by Hindu College and Presidency College.
  • IIM Ahmedabad was ranked the best B-school in India in the management category, with IIM Bangalore second and IIM Calcutta third.

What is NIRF Ranking?

The National Institutional Ranking Framework (NIRF) was launched by the Ministry of Human Resource Development (MHRD) on September 29, 2015. It provides a methodology for ranking institutions across India in different categories. The methodology was prepared from the overall recommendations and broad understanding arrived at by a Core Committee, which was formed by the MHRD to identify broad parameters for ranking universities and institutions. The parameters cover Teaching, Learning & Resources; Research & Professional Practices; Graduation Outcomes; Outreach & Inclusivity; and Perception.

Wednesday, October 06, 2021

NIRF ranking does not give full picture of higher education in India

 

S S Mantha and Ashok Thakur write: It is based on limited parameters and seems to be committing the same sin that the global rankings systems were once accused of — a one-size-fits-all approach.


The National Institutional Ranking Framework (NIRF) of universities and colleges for the year 2021 was released on September 9 by the Ministry of Education (MoE). There has been a big upset with the Indian Institute of Science (IISc) being dislodged by IIT Madras as the top institute in the country. Apart from this, the rest seemed business as usual. How seriously should we take this annual educational tournoi and how does it compare with the global systems?

The world over, ranking educational institutes is a matter of debate and research. There are at least 20 global ranking agencies that measure quality on various parameters. Australia has the Research Performance Index that measures universities’ performance. The Centre for Science and Technology Studies at Leiden University maintains European and worldwide rankings of the top 500 universities based on the number and impact of Web of Science-indexed publications per year. The QS World University Rankings have been published annually since 2004. In 2009, QS even launched the QS Asian University Rankings in partnership with the Chosun Ilbo newspaper in South Korea. Times Higher Education (THE), a British publication, and Thomson Reuters have been providing a new set of world university rankings since 2011.

Interestingly, there is also a “Ranking of Rankings”, UniRanks, launched in 2017. It aggregates the results of five global rankings, combining them to form a single rank. It uses THE World University Ranking (22.5 per cent), QS World University Ranking (22.5 per cent), US News Best Global University (22.5 per cent), Academic Ranking of World Universities (22.5 per cent), and Reuters World Top 100 Innovative Universities (10 per cent).
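For readers who want to see the arithmetic, here is a minimal Python sketch of this kind of weighted aggregation, using the UniRanks weights quoted above; the per-ranking scores for the example institution are invented placeholders, not real data.

```python
# Illustrative sketch of a UniRanks-style weighted aggregation.
# The weights come from the text above; the per-ranking scores for the
# hypothetical institution are placeholders, not real data.

WEIGHTS = {
    "THE World University Ranking": 0.225,
    "QS World University Ranking": 0.225,
    "US News Best Global University": 0.225,
    "Academic Ranking of World Universities": 0.225,
    "Reuters Top 100 Innovative Universities": 0.10,
}

def aggregate_score(scores: dict) -> float:
    """Combine normalised (0-100) scores from each ranking into a single number."""
    return sum(WEIGHTS[name] * scores.get(name, 0.0) for name in WEIGHTS)

# Hypothetical normalised scores for one institution.
example = {
    "THE World University Ranking": 62.0,
    "QS World University Ranking": 58.5,
    "US News Best Global University": 60.0,
    "Academic Ranking of World Universities": 55.0,
    "Reuters Top 100 Innovative Universities": 40.0,
}
print(f"Aggregate score: {aggregate_score(example):.2f}")
```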

The NIRF ranking is based on five parameters — teaching-learning and resources, research and professional practice, graduation outcomes, outreach and inclusivity, and perception about the institution. The overall score is computed based on the weightage allotted to each parameter and sub-parameter. Some data is provided by the institutions themselves and the rest is sourced from third-party sites.

The quality of an institution is a function of several inputs and the above indicators alone may not be sufficient. How can we include the skills that an institution/university imparts to its students as one of the important ingredients? Should the financial health and size of the institution not be a criterion? Should the financial benefits that accrue to the stakeholders, especially the students, not be linked to the ranking? Ideally, an objective function must be defined for an institution, with the desired attributes as variables and weightage apportioned to each such attribute depending on its importance in the overall value proposition.

Whereas IISc, with 464 faculty members for 4,000 students, has a faculty ratio of 1:8.6 and receives about Rs 350 crore in central grants, BHU, with 2,000 teachers for 32,000 students, has a ratio of 1:16 and receives a grant of about Rs 200 crore. In the Union Budget 2021, the government allocated Rs 7,686 crore to the IITs, whereas the total outlay for all Central Universities was Rs 7,643.26 crore. Some departments in the IITs have even better faculty ratios since they are not bound by the cadre rules applicable to state universities. While state university budgets are ridiculously low, they are all competing on the same quality parameters and are expected to outperform the better-endowed ones. Is it not time to also check the return on investment (ROI), especially when several of our students from elite institutions, educated on public money, don't even serve within the country? Surely, ROI is an important parameter missing from the NIRF rankings.
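The faculty-ratio comparison above is simple arithmetic, sketched below in Python using the figures quoted in the paragraph; the "grant per student" figure is an illustrative derived number of my own, not a metric the authors or NIRF use.

```python
# Quick arithmetic behind the IISc vs BHU comparison in the text.
# Figures (faculty, students, central grant in Rs crore) are as quoted above;
# "grant per student" is an illustrative derived figure, not an NIRF metric.

institutions = {
    "IISc": {"faculty": 464, "students": 4_000, "grant_crore": 350},
    "BHU":  {"faculty": 2_000, "students": 32_000, "grant_crore": 200},
}

for name, d in institutions.items():
    ratio = d["students"] / d["faculty"]                         # students per faculty member
    grant_per_student = d["grant_crore"] * 1e7 / d["students"]   # 1 crore = 1e7 rupees
    print(f"{name}: faculty ratio 1:{ratio:.1f}, "
          f"grant per student ~ Rs {grant_per_student:,.0f}")
```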

The diversity in the Indian education system is large. There are fresh as well as old institutions offering degrees/diplomas/certifications. There are also technology vs social sciences institutions, multi-disciplinary vs single discipline, private vs public, research-based, innovation-based, language-based or even special-purpose institutions/universities. The boundary conditions in which they operate are very different. NIRF seems to be committing the same sin that the global rankings systems were once accused of — a one-size-fits-all approach.

Another glaring oversight is the disconnect that exists between the ranking and accreditation. Several universities have earned a NAAC A grade but figure poorly in the ranking system. NIRF must take into consideration the NAAC and NBA scores. Though the government has no role in the business of either ranking or accreditation, the least one can expect from the NBA or NAAC is that their left hand knows what the right hand is doing.

Ranking is a numbers game as after two iterations, institutes become adept at giving the data that maximise scores. Accreditation, on the other hand, is a peer-reviewed process and is often accused of subjectivity. Though both are imperfect, accreditation and Quality Assurance (QA) would probably be the de facto standard in the future, like in the US, as they allow stakeholders to sue the universities if they renege on delivering what they claim. A Bill to introduce such accountability was introduced in 2011 but it never saw the light of day.

Two factors that are absent and differentiate us from the global ranking systems are our lack of international faculty and students and the inadequacy of our research to connect with the industry. International faculty and students will come only if they see a value proposition in our institutions, an indicator of quality. Industry connect will happen only when the research translates into improved or new processes and products. Patents translated to products have value, not patents that are just filed. To make this happen, NIRF has to have top experts not only from the country but also from outside in its core committees.

Our institutions have been falling short on global expectations on both these counts from the beginning. Though, as a consequence, NIRF arrived post-2014 with parameters to assuage our ruffled egos, we must be pragmatic and realise that quality cannot be measured in a silo. Having let go of being compared on a global scale, our universities can choose to be rank insiders or rank outsiders.

This column first appeared in the print edition on October 6, 2021 under the title ‘How not to test quality’.
Mantha is former chairman, AICTE and Thakur is former Secretary, MHRD, GoI

Source: Indian Express, 6/10/21



Friday, August 28, 2020

Why global university rankings miss Indian educational institutions

 

Since universities are complex organisations with multiple objectives, comparing universities using a single numerical value is as ineffectual as comparing a civil engineer with a biologist or a linguist and a dancer.


The best indicators of a university's performance are the learning outcomes and how its education has impacted the students and society. The hype surrounding the announcement of world university rankings by international ranking organisations is unfortunate. Regardless of whether the rankings are beneficial or not, more universities than ever before want to get into these rankings. The obsession with being within the top 100 universities in the world is exasperating. Since there is a potential danger of creating elitism among universities through such rankings, lower-ranked universities may lose out on many counts. Some top-ranked universities want to collaborate only with other top-ranked universities, causing the less fortunate ones to sink further under inescapable stigmatisation.

International ranking organisations also force universities to alter their core missions. This has happened with JNU. Although JNU ranks between 100 and 200 in certain disciplines, it does not find a place in world university rankings. The reason is JNU does not offer many undergraduate programmes. We were indirectly told to start more undergraduate programmes in order to scale the ranking order while our university is predominantly a research-oriented institution.

First, let me state the obvious. Indian institutions lose out on perception, which carries almost 50 per cent weightage in many world university ranking schemes. Psychologists know that perception is a result of different stimuli such as knowledge, memories, and expectancies of people. While one can quantitatively measure the correlation between stimuli and perception, perception cannot be a quantifiable standalone parameter. Therefore, perception as a major component in the ranking process can easily lead to inaccurate or unreasonable conclusions.

Rightly or wrongly, international ranking organisations use citations as a primary indicator of productivity and the scientific impact a discipline makes. However, studies show that the number of citations per paper is highest in multidisciplinary sciences, general internal medicine, and biochemistry, and lowest in subjects such as visual and performing arts, literature and architecture. It is nobody's case that the latter subjects are of any less importance. By making citations of published papers from a university a strong parameter for rankings, we seem to have developed an inexplicable blind spot when it comes to the differences among subject disciplines. It is no wonder that universities such as JNU, whose student intake in science research programmes is smaller compared to other disciplines, will lose out in world university rankings although it has been rated as the second-best university in India.

International ranking organisations are too rigid in their methodology and are not willing to add either additional parameters or change the weightage of current parameters. They are disinclined to employ meaningful and universally fair benchmarks of quality and performance. This is an absolute requisite to take into account the diversity that prevails among the universities. Some Indian higher education institutions even decided not to participate in the world university rankings alleging a lack of transparency in the parameters that are used in the ranking process.

Since universities are complex organisations with multiple objectives, comparing universities using a single numerical value is as ineffectual as comparing a civil engineer with a biologist or a linguist and a dancer. Hence, the danger that such skewed world rankings will downgrade the university education to a mere commodity is a realistic trepidation. This inelastic stance of ranking organisations has forced more than 70 countries to have their own national ranking systems for higher educational institutions.

I had argued in an editorial in IETE Technical Review (March 2015) for India to have its own national ranking system. The MHRD established the National Institutional Ranking Framework (NIRF) in 2016. The parameters used by NIRF for ranking Indian institutions are also most suited for many other countries — among the parameters are teaching, learning & resources, research and professional practice, graduation outcomes, outreach and inclusivity and peer perception. Unlike international ranking organisations, NIRF gives only 10 per cent weightage for perception.

In 2016, the NIRF rankings were given in four categories — University, Engineering, Management and Pharmacy. College, Medical, Law, Architecture and Dental were added in 2020. This shows how NIRF is refining its ranking methodology by taking inputs from the stakeholders, which the international ranking organisations seldom do. No right-minded person can plausibly argue against such a ranking system, which recognises and promotes the diversity and intrinsic strengths of Indian educational institutes.

International ranking organisations are often sightless about what it takes to build a world-class educational system as compared to a world-class university. If a country has a world-class educational system with a focus on innovation, the best teaching-learning processes, research oriented towards social good, and affirmative action plans for inclusive and accessible education, it will have a more visible social and economic impact.

Indian higher educational institutes need to ask themselves: What positive role can they play in improving the quality of higher education? What can we do to adopt innovative approaches to become future ready? And they need to act on those questions to make a change and plan beyond what is obvious.

NIRF will stimulate healthy competition among Indian educational institutes, which should eventually lead to a world-class Indian educational system. This system will act as a catalyst for the transformation of local universities to world-class institutions.

This article first appeared in the print edition on August 28, 2020 under the title ‘Home and the world’. The writer is Vice-chancellor, JNU.

Source: Indian Express, 28/08/20

Friday, June 26, 2020

Are Some IITs Over-Pampered and Underperforming?


The fifth edition of the National Institutional Ranking Framework (NIRF) was released online June 11. This year, seven of the top 10 places have gone to the IITs. In all five instalments, the old and preeminent IITs have been at the top spots.
On June 14, I had pointed out that the NIRF methodology had an important flaw. It builds a single score from five categories: teaching, learning and resources (TLR), research and professional practices (RPC), graduation outcomes (GO), outreach and inclusivity (OI), and perception. These five broad heads are built up from various sub-heads, and a complex weighting and addition scheme is used to obtain the overall rating score, which can take a maximum value of 100. The institutions are finally rank-ordered based on these scores.
The flaw is that output and input scores are added to obtain the final score instead of being divided. Any performance analysis requires identifying an input score and a quality score based on the ratio of output (RPC and GO being appropriate proxies) to input (TLR). Instead, NIRF adds the inputs to the outputs, i.e. TLR, RPC and GO along with OI and perception, to get the final score. Note in addition that OI and perception relate neither to academic excellence nor research excellence, but these are added as well.
P. Sriram of IIT Madras recently pointed out in a personal communication that the NIRF uses performance parameters in several groups, including TLR, publications and graduation outcomes as reflecting “desirable” and “undesirable” traits, where the desirables get positive and high scores and undesirables get low scores. From this viewpoint, the TLR parameter receives a high score for a high faculty-student ratio, high spending on infrastructure, high PhD enrolment etc. NIRF adds up all the scores – but completely overlooks the systems’ paradigm. That is, from a systems viewpoint, the ‘best’ institute should be based on the ratio of outputs to inputs.
Now, nowhere in the NIRF portal can a measure be found for the size-dependent input nor a size-dependent proxy for the output, both vis-à-vis academic excellence. The systems approach allows one to compute quality as output score divided by input score. Professor Sriram suggested that capital expenditure can be a meaningful input measure, but a lot of experimenting showed that the major chunk of expenditure is committed to salaries of faculty members and non-teaching staff, and that it may be better to use ‘total expenditure’ as a proxy for the input.
Keeping this in mind, I computed the total expenditure (sum of capital and operating expenditures) as a proxy for input and the sum of RPC and GO scores as the output scores for the top 100 engineering institutes (in the current form of the NIRF rankings). Then the cost effectiveness of each institution becomes simply the ratio of output to input.
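A minimal Python sketch of that cost-effectiveness calculation is given below, assuming the RPC and GO scores and the expenditure figures are available for each institute; the numbers used here are invented placeholders, not actual NIRF 2020 data.

```python
# Sketch of the cost-effectiveness calculation described above:
# output = RPC + GO scores, input = total (capital + operating) expenditure,
# cost effectiveness = output / input. Values below are placeholders,
# not actual NIRF 2020 data.

from dataclasses import dataclass

@dataclass
class Institute:
    name: str
    rpc: float              # research and professional practices score
    go: float               # graduation outcomes score
    capex_crore: float      # capital expenditure (Rs crore)
    opex_crore: float       # operating expenditure (Rs crore)

    @property
    def output(self) -> float:
        return self.rpc + self.go

    @property
    def total_expenditure(self) -> float:
        return self.capex_crore + self.opex_crore

    @property
    def cost_effectiveness(self) -> float:
        return self.output / self.total_expenditure

# Hypothetical institutes, to show the rank-ordering step.
institutes = [
    Institute("Institute A", rpc=85.0, go=90.0, capex_crore=300, opex_crore=900),
    Institute("Institute B", rpc=55.0, go=80.0, capex_crore=60,  opex_crore=250),
]

for inst in sorted(institutes, key=lambda i: i.cost_effectiveness, reverse=True):
    print(f"{inst.name}: output={inst.output:.1f}, "
          f"expenditure={inst.total_expenditure:.0f} crore, "
          f"cost-effectiveness={inst.cost_effectiveness:.3f}")
```

Note how the leaner Institute B ranks above the better-funded Institute A on this measure, which is exactly the kind of reordering the author reports.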
The table and figure below summarise what we found. IIT Indore and Jadavpur University stand out on this basis. Many National Institutes of Technology get into the top positions. And most of the preeminent IITs vanish from this list, as the law of diminishing returns comes to the fore. So perhaps we must ask if some of the IITs are over-pampered and underperforming.
Source: The Wire, 25/06/20

Thursday, June 25, 2020

A Flaw in the NIRF Rankings – and a Fix

The Ministry of Human Resource Development released the fifth edition of the National Institutional Ranking Framework on June 11. Since 2017, when the first edition was introduced, engineering institutions have dominated the list. This year, seven of the top 10 places are occupied by the IITs. And in all five instalments, IIT Madras has never been dislodged from the top spot in the engineering rankings. Strangely, the other international ranking exercises, the Shanghai ARWU, the Times Higher Education and the Quacquarelli Symonds rankings, have rarely given IIT-M the top spot.

However, the ranking's methods have a serious flaw (apart from the other well-known flaws that visit all ranking exercises). It builds a single score from five categories: teaching, learning and resources (TLR), research and professional practices (RPP), graduation outcomes (GO), outreach and inclusivity, and perception. These five broad heads are built up from various sub-heads, and a complex weighting and addition scheme is used to obtain the overall rating score, which can take a maximum value of 100. The institutions are finally rank-ordered based on these scores.

The flaw is that in a university system, TLR is technically 'input' and RPP and GO are 'output', and the NIRF adds the input and output scores to obtain the final score. This violates the basic principle of performance analysis: that an input score must be identified and that quality is based on the ratio of the output to the input. (Note that outreach and inclusivity and perception relate neither to academic nor research excellence, but these are added as well.)

What happens when the NIRF scores are recomputed without adding the input and output? Let's use an alternative two-dimensional paradigm, where the input is the 'teaching, learning and resources' (TLR) score, and the output is the sum of the 'research and professional practices' (RPP) and the 'graduation outcomes' (GO) scores. First, I normalise the values using the totals for RPP, GO and TLR for the top 100 engineering institutions in the NIRF 2020 list, then calculate the performance and quality scores. If a ranking is really required, we can obtain a single score by multiplying the quality score and the output score.

With the multiplied score, IIT Kharagpur and IIT Delhi now rank ahead of IIT Madras. Note the presence of Jadavpur University ahead of IIT Kanpur, and the presence of Vellore Institute of Technology ahead of IIT (Indian School of Mines) Dhanbad and IIT Guwahati. This is because the multiplied-score procedure recognises the fact that Jadavpur University and Vellore Institute of Technology leverage lower teaching and learning resources to produce relatively higher outcomes (research and graduation) than many privileged IITs. The NIRF score in its current form is unable to make this distinction.
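The sketch below illustrates that recomputation in Python, under the scheme described above (input = TLR, output = RPP + GO, quality = output/input, and a multiplied score of quality × output where a single rank is needed); the three institutions and their scores are invented placeholders, not NIRF 2020 values, and the exact normalisation used in the article may differ in detail.

```python
# Sketch of the alternative scoring described above: input = TLR,
# output = RPP + GO, quality = output / input, and a single ranking score
# obtained by multiplying quality by output. All scores are invented
# placeholders, not actual NIRF 2020 values.

def normalise(values):
    """Divide each value by the total across the ranked set."""
    total = sum(values)
    return [v / total for v in values]

# Hypothetical (TLR, RPP, GO) scores for three institutions.
names = ["Institute A", "Institute B", "Institute C"]
tlr = normalise([90.0, 60.0, 70.0])   # input: teaching, learning and resources
rpp = normalise([85.0, 65.0, 55.0])   # output: research and professional practices
go  = normalise([88.0, 82.0, 75.0])   # output: graduation outcomes

results = []
for name, i, r, g in zip(names, tlr, rpp, go):
    output = r + g               # normalised output score
    quality = output / i         # output per unit of input
    final = quality * output     # multiplied score used for rank-ordering
    results.append((name, quality, final))

for name, quality, final in sorted(results, key=lambda x: x[2], reverse=True):
    print(f"{name}: quality = {quality:.2f}, multiplied score = {final:.3f}")
```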


Gangan Prathap is an aeronautical engineer and former scientist at the National Aeronautical Laboratory, Bangalore and former VC of Cochin University of Science and Technology. He is currently a professor at the A.P.J. Abdul Kalam Technological University, Thiruvananthapuram.
Source: The Wire, 14/06/2020