Abstract

Purpose – The purpose of this paper is to evaluate the effectiveness of the information retrieval (IR) component of a daily newspaper publisher's integrated library system (ILS) in comparison with open source alternatives, and to observe the impact of the scale of metadata, generated daily by library administrators, on the retrieved result sets.

Design/methodology/approach – In Experiment 1, the authors compared the result sets of the information retrieval system (IRS) component of the publisher's current ILS, and those of the proposed alternatives, against a human-assessed relevance judgment set. In Experiment 2, the authors compared the performance of the proposed IRS components with the publisher's current production IRS, using the result sets of the current IRS classified as relevant. Both experiments were conducted using standard IR evaluation measures: precision, recall, precision at k, F-measure, mean average precision and 11-point interpolated average precision.

Findings – The results showed that: first, in Experiment 1, the publisher's current production ILS ranked last among all participating IRSs when compared against the relevance document set classified by the senior library administrator; and second, in Experiment 2, the tested IR components' request handlers that used only automatically generated metadata performed slightly better than the request handlers that used all of the metadata fields. Therefore, with regard to IR effectiveness, the daily human effort of generating the publisher's current set of metadata attributes is unjustified.

Research limitations/implications – The experiments' collections contained documents in Slovene, a language with a large number of inflected forms of nouns, verbs and adjectives. The results could differ for collections in languages with different grammatical properties.

Practical implications – The authors have confirmed, using standard IR methods, that the IR component used in the publisher's current ILS could be adequately replaced with an open source component. Based on this research, the publisher could incorporate the suggested open source IR components in practice. The authors have also described methods that libraries can use to evaluate the IR effectiveness of their own ILSs.

Originality/value – The paper provides a framework that libraries can use to evaluate the IR effectiveness of an ILS. Based on the evaluation results, libraries could replace their IR components if their current information system setup allows it.
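For reference, the evaluation measures named above are conventionally defined as follows. This is a sketch using the standard textbook formulations; the symbols Rel, Ret, Q, rel(k) and n are introduced here for illustration only, and the paper's exact variants may differ slightly.

\[
\mathrm{Precision} = \frac{|\mathit{Rel} \cap \mathit{Ret}|}{|\mathit{Ret}|},
\qquad
\mathrm{Recall} = \frac{|\mathit{Rel} \cap \mathit{Ret}|}{|\mathit{Rel}|},
\qquad
F_1 = \frac{2 \cdot \mathrm{Precision} \cdot \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}
\]

\[
P@k = \frac{|\{\text{relevant documents among the top } k \text{ retrieved}\}|}{k},
\qquad
\mathrm{AP} = \frac{1}{|\mathit{Rel}|} \sum_{k=1}^{n} P@k \cdot \mathrm{rel}(k),
\qquad
\mathrm{MAP} = \frac{1}{|Q|} \sum_{q \in Q} \mathrm{AP}(q)
\]

where Rel is the set of relevant documents, Ret the set of retrieved documents, rel(k) equals 1 if the document at rank k is relevant and 0 otherwise, n is the number of retrieved documents, and Q is the set of queries. The 11-point interpolated average precision averages the interpolated precision

\[
p_{\mathrm{interp}}(r) = \max_{r' \ge r} p(r')
\]

over the eleven recall levels r = 0.0, 0.1, \ldots, 1.0.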
