Abstract
Evaluating scientific journals is challenging because many competing impact measures exist: journal ranking is a multidimensional construct that cannot be assessed effectively with a single metric such as the impact factor. A few studies have proposed ensembles of metrics to mitigate the bias introduced by any individual metric. In this study, a multi-metric journal ranking method based on the standardized average index (SA index) was adopted to develop an extended standardized average index (ESA index). The ESA index utilizes six metrics: the CiteScore, Source Normalized Impact per Paper (SNIP), SCImago Journal Rank (SJR), Hirsch index (H-index), Eigenfactor Score, and Journal Impact Factor, drawn from three well-known databases (Scopus, SCImago Journal & Country Rank, and Web of Science). Experiments were conducted in two computer science subject areas: (1) artificial intelligence and (2) computer vision and pattern recognition. A comparison of the multi-metric-based journal ranking system with the SA index demonstrated that the multi-metric ESA index correlates highly with all other indicators and significantly outperforms the SA index. To further evaluate the performance of the model and determine the aggregate impact of bibliometric indices on the ESA index, we employed unsupervised machine learning techniques, namely clustering coupled with principal component analysis (PCA) and t-distributed stochastic neighbor embedding (t-SNE). These techniques were used to measure the clustering impact of the various bibliometric indicators on both the complete set of bibliometric features and a reduced feature set. Furthermore, the results of the ESA index were compared with those of other ranking systems, including the internationally recognized Scopus and SJR rankings and the HEC Journal Recognition System (HJRS) used in Pakistan.
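The abstract does not give the exact SA/ESA formula, but one plausible reading of a "standardized average" over the six indicators is to z-score each metric across the journals in a subject area and then average the standardized values per journal. The sketch below illustrates that aggregation on hypothetical journals with invented metric values; the metric names follow the abstract, while the z-score formulation and sample data are assumptions for illustration only.

```python
import math

# Hypothetical journals with invented values for the six indicators named
# in the abstract (CiteScore, SNIP, SJR, H-index, Eigenfactor, JIF).
journals = {
    "Journal A": {"CiteScore": 11.2, "SNIP": 2.1, "SJR": 1.8,
                  "H-index": 150, "Eigenfactor": 0.050, "JIF": 8.3},
    "Journal B": {"CiteScore": 5.4,  "SNIP": 1.2, "SJR": 0.9,
                  "H-index": 80,  "Eigenfactor": 0.010, "JIF": 4.1},
    "Journal C": {"CiteScore": 2.0,  "SNIP": 0.7, "SJR": 0.4,
                  "H-index": 35,  "Eigenfactor": 0.004, "JIF": 1.6},
}

def zscores(values):
    """Standardize a list of metric values to mean 0, unit variance."""
    mean = sum(values) / len(values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return [(v - mean) / sd if sd else 0.0 for v in values]

names = list(journals)
metrics = list(next(iter(journals.values())))

# Standardize each metric column across journals, then average the
# standardized values per journal to obtain an ESA-style composite score.
cols = {m: zscores([journals[j][m] for j in names]) for m in metrics}
esa = {j: sum(cols[m][i] for m in metrics) / len(metrics)
       for i, j in enumerate(names)}

ranking = sorted(names, key=esa.get, reverse=True)
print(ranking)  # journals ordered by composite standardized score
```

Because each metric is standardized before averaging, indicators on very different scales (e.g. Eigenfactor near 0.01 versus H-index near 100) contribute comparably to the composite, which is the usual motivation for a standardized rather than raw average.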
These comparisons demonstrated that the multi-metric-based ESA index can serve as a valuable reference for publishers, journal editors, researchers, policymakers, librarians, and practitioners in journal selection, decision making, and professional assessment.