Rank disorder in higher education
Two Indian institutes count among the world’s top 200 in the 2015 Quacquarelli Symonds (QS) ranking — IISc, Bangalore at 147 and IIT Delhi at 179. International ranking agencies thus pep us up to keep us engaged in the frenzy of ranking higher education institutes. This is a strange obsession for a nation that is remarkably tolerant of sub-standard academic institutions and callously impervious to the neglect of even the good ones. Among international ranking agencies, QS and Times Higher Education Ranking (THER) have aggressively marketed themselves. These agencies also provide consultancy, at exorbitant cost, to various institutions to help improve their ranks, notwithstanding the conflict of interest in teaching the tricks of the trade and then assessing them. Competition among institutions has translated into competition among ranking agencies and nations. The USA has its own agencies that rank US universities and the Chinese have theirs, both differing from the QS and THER ranking methodologies. The ministry of human resource development is formulating its own ranking system appropriate to Indian institutions.
This counter-strategy has been prompted because existing ranking methods can be subjective, arbitrary and commercially rather than academically driven, with perceptual rather than real value. A large portion of the overall weightage, ranging from 33 per cent to almost 50 per cent, is based on ‘reputation surveys’ whose process is quite non-transparent. There is significant weightage on internationalisation (about 15 per cent), on which Indian universities score almost zero. The citation data is often disputed and its calculation base is unclear. Faculty-student ratios carry 15 per cent weightage. Some agencies count non-faculty research staff with doctorates as equivalent to faculty. This disadvantages Indian institutes, which lack a large pool of post-doctoral fellows or a scientific cadre parallel to faculty. Leading institutions like the IITs teach engineering and technology; however, they are ranked as comprehensive universities. When considered only in the engineering and technology category, the five older IITs fall within the top 100 ranks, and in some areas they rank in the 30s. International rankings ignore aspects important to the IITs such as the impact of alumni, national and international research collaborations with industry and research laboratories, and their societal role in education. If internationalisation and Nobel laureates are given zero weightage, the IITs rise to the top 25 ranks.
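The weightage scheme described above is, in essence, a weighted sum of normalised indicator scores, which is why a near-zero score on one heavily weighted indicator can drag down an otherwise strong institute. A rough sketch of the arithmetic (the weights and indicator values below are hypothetical, loosely echoing the percentages quoted here, and do not reproduce any agency’s actual formula):

```python
# Hypothetical indicator weights, loosely based on the ranges quoted above:
# reputation surveys ~40%, citations ~20%, faculty-student ratio 15%,
# internationalisation 15%, a residual 10% for everything else.
WEIGHTS = {
    "reputation": 0.40,
    "citations": 0.20,
    "faculty_student_ratio": 0.15,
    "internationalisation": 0.15,
    "other": 0.10,
}

def composite_score(indicators: dict) -> float:
    """Weighted sum of indicator scores, each normalised to a 0-100 scale."""
    return sum(WEIGHTS[k] * indicators.get(k, 0.0) for k in WEIGHTS)

# An institute strong on citations but scoring near zero on
# internationalisation forfeits almost the entire 15 per cent block,
# capping its composite score well below otherwise comparable peers.
iit_like = {"reputation": 60, "citations": 85, "faculty_student_ratio": 70,
            "internationalisation": 5, "other": 60}
print(composite_score(iit_like))
```

Swinging the internationalisation value from 5 to 90 in this toy model moves the composite by nearly 13 points, which is the scale of distortion the article is pointing at.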
PitchBook, a global private equity and venture capital data company, ranked academic institutions worldwide by the number of alumni who founded US-based companies that received first-round venture capital funding from 2010 to the end of the third quarter of 2013. The IITs ranked among the world’s top ten universities. So ranking is very much a matter of perspective: anyone can look good or bad depending on the metrics used, the time period under reference, the distribution of weightage and the peer group.
Ranking can create serious distortions in the system. The mad race to improve ranks has spawned scandals and spurious quality. A report in Science magazine (December 16, 2013) on papers in journals indexed by Thomson Reuters’ Science Citation Index (SCI) found a flourishing academic black market involving scientists, agencies and journal editors, with fees ranging from $1,600 to $26,300 for hired researchers to write articles for high-impact-factor journals. In India, research papers in Nature or Science are given the same value as those in an obscure journal.
The ranking systems do throw up odd truths, but these are not so much for the ranked institutions as for the government. Foremost among them is that the government must increase its funding for education to build institutional infrastructure, undertake significantly large research programmes to improve the quality and quantity of research output and citations, expand post-doctoral programmes, and attract high-quality international students and faculty. Funding has a major impact on the growth of academic institutions. Massive government investment in infrastructure, research, faculty, post-doctoral students, internationalisation and high-quality research staff has raised the ranks of NUS and NTU (in Singapore). Ranked much below the IITs in the early 80s, they have now surpassed the IITs in post-graduate education and research outcome parameters, which significantly affect such rankings. The government must also review archaic regulatory systems that restrict academic growth. Finally, we need to ask: can the government expect better university rankings despite poor school performance? In 2009, India came second-last among 73 nations, just ahead of Kyrgyzstan, in the Organisation for Economic Cooperation and Development’s PISA test, which measures the performance of 15-year-olds in reading, maths and science. India pulled out of the PISA tests of 2012 and 2015. But can one have quality higher education with the second-lowest quality of elementary education? And is there any sense in ranking frameworks, then?
The central issue is steadfast dedication by the state to creating an environment that enables improvement in the quality of education offered. Is there any sense in starving a child, then measuring its weight and scolding it for being underweight compared with the neighbour’s brood, fattened by love and care?
Source | Financial Chronicle | 1 October 2015