Abstract

We comment on a 1996 letter to Nature on the long-term decline of Indian science, pointing out methodological reasons why the Science Citation Index (SCI) data used by the authors do not unambiguously lead to their stated conclusions. Our arguments rest on the contention that no valid statement about change in a country's output can be made over a period in which the SCI's journal coverage of that country has changed significantly. We suggest that, for longitudinal comparisons of country-level performance, it should first be verified that the set of journals from that country covered by the SCI remained constant over the period. This check would be straightforward if the country of publication of each journal were included as a field in the SCI database. We define a Visibility Index as the cumulated impact and derive a relation that estimates the change in visibility by combining changes in output and average impact. For the period during which Indian journal coverage remained unchanged, a detailed analysis of output for two years (1990 and 1994) leads us to conclude that, with the exception of Agriculture, publication has increased in virtually every field, with a significant increase in the overall mean Impact Factor. At least 25 subfields show statistically significant increases in mean Impact Factor and Visibility. The effect of foreign collaboration on visibility is also considered. In conclusion, we touch upon the question of citation as a performance indicator for Third World countries, since high citation counts and local relevance may be conflicting objectives.
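
A minimal sketch of the relation referred to above, assuming the Visibility Index V of a country is the cumulated impact, i.e. the number of papers N multiplied by the mean Impact Factor \bar{I} (the symbols V, N and \bar{I} are our notation for this sketch, not necessarily the authors' own):

\[
V = N\,\bar{I}
\quad\Longrightarrow\quad
\frac{\Delta V}{V} \;\approx\; \frac{\Delta N}{N} + \frac{\Delta \bar{I}}{\bar{I}}
\]

Under this reading, the relative change in visibility decomposes, to first order, into the relative change in output plus the relative change in average impact, which is one way the stated combination of output and impact changes could be estimated.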
