Purpose – Academic rankings face various issues, including the use of data sources that are not publicly verifiable, subjective parameters, a narrow focus on research productivity, and regional biases. This work aims to enhance the credibility of the ranking process by using objective indicators based on publicly verifiable data sources.
Design/methodology/approach – The proposed ranking methodology, OpenRank, derives its objective indicators from two well-known publicly verifiable data repositories: ArnetMiner and DBpedia.
Findings – The resulting academic ranking reflects common tendencies of the international academic rankings published by the Shanghai Ranking Consultancy (SRC), Quacquarelli Symonds (QS) and Times Higher Education (THE). Evaluation of the proposed methodology demonstrates its effectiveness and quick reproducibility with a low cost of data collection.
Research limitations/implications – Implementation of the OpenRank methodology faced the issue of availability of quality data. In the future, the accuracy of academic rankings can be improved further by employing additional relevant public data sources, such as the Microsoft Academic Graph, the millions of graduates' profiles available in LinkedIn, and the bibliographic data maintained by the Association for Computing Machinery and Scopus.
Practical implications – The suggested use of open data sources offers new dimensions for evaluating the academic performance of higher education institutions (HEIs) and for gaining a comprehensive understanding of the catalyst factors in higher education.
Social implications – The work highlights the need for a purpose-built, publicly verifiable electronic data source for performance evaluation of global HEIs. Availability of such a global database would support better academic planning, monitoring and analysis, and would enable more transparent, reliable and less controversial academic rankings.
Originality/value – We propose a solution for improving the HEI ranking process by making the following contributions: (1) enhancing the credibility of the ranking results by employing only objective performance indicators extracted from publicly verifiable data sources, (2) developing an academic ranking methodology based on objective indicators using two well-known data repositories, DBpedia and ArnetMiner, and (3) demonstrating the effectiveness of the proposed ranking methodology on real data sources.
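To illustrate what "publicly verifiable" data collection can look like in practice, the minimal sketch below queries DBpedia's public SPARQL endpoint for institution-level attributes that could serve as raw material for objective indicators. It is not the authors' OpenRank implementation; the indicator set, the specific DBpedia properties used (dbo:numberOfStudents and dbo:academicStaffSize are assumptions here), and the derived student/staff ratio are purely illustrative.

```python
# Illustrative sketch only: fetching publicly verifiable institution attributes
# from DBpedia's open SPARQL endpoint. The chosen properties and the ratio
# computed below are assumptions, not the indicators defined by OpenRank.
import requests

DBPEDIA_ENDPOINT = "https://dbpedia.org/sparql"

QUERY = """
PREFIX dbo: <http://dbpedia.org/ontology/>
SELECT ?university ?students ?staff WHERE {
  ?university a dbo:University ;
              dbo:numberOfStudents ?students ;
              dbo:academicStaffSize ?staff .
}
LIMIT 10
"""

def fetch_indicators():
    """Yield (university URI, student count, staff count) tuples from DBpedia."""
    response = requests.get(
        DBPEDIA_ENDPOINT,
        params={"query": QUERY, "format": "application/sparql-results+json"},
        timeout=30,
    )
    response.raise_for_status()
    for row in response.json()["results"]["bindings"]:
        yield (
            row["university"]["value"],
            int(float(row["students"]["value"])),
            int(float(row["staff"]["value"])),
        )

if __name__ == "__main__":
    for uri, students, staff in fetch_indicators():
        # A simple, reproducible ratio as an example of an objective indicator;
        # any real indicator would be defined by the ranking methodology itself.
        ratio = round(students / staff, 1) if staff else None
        print(uri, students, staff, ratio)
```

Because the endpoint is open, anyone can rerun the same query and verify the retrieved values, which is the property the abstract's credibility argument rests on.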