Abstract
A Review of:
 Hanneke, R., & O’Brien, K. K. (2016). Comparison of three web-scale discovery services for health sciences research. Journal of the Medical Library Association, 104(2), 109-117. http://dx.doi.org/10.3163/1536-5050.104.2.004
 
 Objective – To compare the results of health sciences search queries in three web-scale discovery (WSD) services for relevance, duplicate detection, and retrieval of MEDLINE content.
 
 Design – Comparative evaluation and bibliometric study.
 
 Setting – Six university libraries in the United States of America.
 
 Subjects – Three commercial WSD services: Primo, Summon, and EBSCO Discovery Service (EDS).
 
 Methods – The authors collected data at six universities, including their own. They tested each of the three WSDs at two data collection sites. However, since one of the sites was using a legacy version of Summon that was due to be upgraded, data collected for Summon at this site were considered obsolete and excluded from the analysis. 
 
 The authors generated three questions for each of six major health disciplines, then designed simple keyword searches to mimic typical student search behaviours. They captured the first 20 results from each query run at each test site, to represent the first “page” of results, giving 2,086 search results in total. These were independently assessed for relevance to the topic. The authors resolved disagreements by discussion and calculated a kappa inter-observer score. They retained duplicate records within the results so that duplicate detection by the WSDs could be compared.
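
As context for the inter-observer statistic reported below, Cohen's kappa measures agreement between two raters beyond what chance alone would produce. The sketch below is illustrative only (not the authors' code or data): it computes kappa for two hypothetical raters making binary relevance judgements, using invented example ratings.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of binary (0/1) judgements."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: proportion of items both raters judged the same.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, from each rater's marginal proportion of 1s.
    p_a1 = sum(rater_a) / n
    p_b1 = sum(rater_b) / n
    p_expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
    return (p_observed - p_expected) / (1 - p_expected)

# Invented relevance judgements for ten search results (1 = relevant).
ratings_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
ratings_b = [1, 1, 0, 0, 0, 1, 1, 0, 1, 0]
print(round(cohens_kappa(ratings_a, ratings_b), 3))
```

Values above roughly 0.6 are conventionally read as good agreement, which is how the κ = 0.725 reported in the results is interpreted.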
 
 They assessed MEDLINE coverage by the WSDs in several ways. They conducted one search from each of the six disciplines in PubMed, using precise strategies to generate a relevant set of articles, so that they could compare retrieval of MEDLINE content. These results were cross-checked against the first 20 results from the corresponding query in the WSDs. To aid investigation of overall coverage of MEDLINE, they recorded the first 50 results from each of the six PubMed searches in a spreadsheet. During data collection at the WSD sites, they searched for these references to discover whether the WSD tool at each site indexed these known items.
 
 The authors adopted measures to control for any customisation of the product setup at each data collection site. In particular, they excluded local holdings from the results by limiting the searches to scholarly, peer-reviewed articles.
 
 Main results – The authors reported results for five of the six sites. All of the WSD tools retrieved 50–60% relevant results. EDS retrieved the highest number of relevant records (195/360 and 216/360), while Primo retrieved the lowest (167/328 and 169/325). There was good observer agreement (κ = 0.725) for the relevance assessment. The duplicate detection rate was similar in EDS and Summon (96–97% unique articles), while the Primo searches returned 82.9–84.9% unique articles.
 
 All three tools retrieved relevant results that were not indexed in MEDLINE, and retrieved relevant material indexed in MEDLINE that was not retrieved in the PubMed searches. EDS and Summon retrieved more non-MEDLINE material than Primo. EDS performed best in the known-item searches, with 300/300 and 299/300 items retrieved, while Primo performed worst with 230/300 and 267/300 items retrieved.
 
 The Summon platform features an “automated query expansion” search function, where user-entered keywords are matched to related search terms and these are automatically searched along with the original keyword. The authors observed that this function resulted in a wholly relevant first page of results for one of the search questions tested in Summon.
 
 Conclusion – While EDS performed slightly better overall, the difference was not great enough in this small sample of test sites to recommend EDS over the other tools being tested. The automated query expansion found in Summon is a useful function that is worthy of further investigation by the WSD vendors. The ability of the WSDs to retrieve MEDLINE content through simple keyword searches demonstrates the potential value of using a WSD tool in health sciences research, particularly for inexpert searchers.