Abstract

Citation-based indicators of journal performance are often assumed to offer an objective, albeit indirect, way of measuring research quality. However, recent concerns about their applicability to research evaluation suggest that these indicators may depend on historical and socioeconomic factors associated with scholarly publishing tradition and business, respectively. The present study addressed this issue quantitatively, using data on the h-index and the Scimago Journal Rank (SJR) for 566 journals in the fields of ecology and evolutionary biology, analysed with Partial Least Squares Structural Equation Modelling. The Tradition Model accounted for <50% of the variation in h-index and SJR, showing that journal performance increased with greater international collaboration on articles and decreased for journals published by non-profit organizations. The Business Model accounted for >60% of the variation in h-index and SJR, showing that journal performance increased in association with publishers' positions in the global 50 ranking and with high article processing charges. Affiliation with countries recognized as world science centres, the use of English, the journals' and publishers' years of origin, and the increases in science investment and scientific production promoted by the richest economies worldwide had no effect on journal performance. These results suggest that the journal h-index and the SJR reflect multidimensional aspects of scholarly publishing, potentially shaped by the marketing strategies of the largest commercial publishers. Given the constraints that publication costs impose on poorly funded scientific communities, uncritical application of these indices to research evaluation worldwide may reinforce the idea that high-quality research is produced only by wealthy scientific societies.