Abstract

A variety of aspects related to the testing of retrieval systems were examined. A model of a retrieval system, together with a set of measures and a methodology for performance testing, was developed. In the main experiment, the effect on performance of the following variables was tested: sources of indexing, indexing languages, coding schemes, question analyses, search strategies, and formats of output. In addition, a series of separate experiments was carried out to investigate the problems of controls in experimentation with IR systems. The main conclusions are: the human factor appears to be the main variable in all components of an IR system; the length of indexes affects performance considerably more than indexing languages do; question analyses and search strategies affect performance to a great extent, as much as, if not more than, indexing. Retrieval systems at present seem able to perform only at a general level, failing to be both comprehensive and specific at the same time. Testing of total IR systems, with control and monitoring of all factors (environmental and system-related), does not appear to be possible at present.
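The abstract mentions a set of measures for performance testing without naming them. As a minimal sketch, assuming the standard retrieval measures of precision and recall (the function name and document identifiers below are hypothetical, not taken from the paper):

```python
# Hypothetical illustration: the paper's actual measures are not named in the abstract.
# Precision and recall are computed for a single question, given the set of documents
# a search retrieved and the set of documents judged relevant to that question.

def precision_recall(retrieved: set[str], relevant: set[str]) -> tuple[float, float]:
    """Return (precision, recall) for one search against one question."""
    hits = len(retrieved & relevant)                 # relevant documents actually retrieved
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# Example: a search retrieves 4 documents, 2 of which are among the 5 judged relevant.
p, r = precision_recall({"d1", "d2", "d3", "d4"}, {"d2", "d4", "d5", "d6", "d7"})
print(f"precision={p:.2f} recall={r:.2f}")           # precision=0.50 recall=0.40
```

Measures of this kind make the trade-off described in the conclusions concrete: a system tuned for comprehensiveness (high recall) tends to lose specificity (precision), and vice versa.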
