Abstract
Background
Rapid reviews are an accelerated evidence synthesis approach intended to meet the timely needs of decision-makers in healthcare settings. Quality of conduct and reporting has been described in the rapid review literature; however, no formal assessment has been carried out using available instruments. The objective of this study was to explore compliance with conduct and reporting guidelines in rapid reviews published or posted online during 2013 and 2014.
Methods
We performed a comprehensive literature search for rapid reviews using multiple bibliographic databases (e.g. PubMed, MEDLINE, EMBASE, the Cochrane Library) through December 31, 2014. Grey literature was searched thoroughly, and health technology assessment agencies were surveyed to identify additional rapid review products. Candidate reviews were assessed for inclusion using pre-specified eligibility criteria. Detailed data were collected from the included reviews on study and reporting characteristics and on variables significant to rapid reviews (e.g. nomenclature, definition). We evaluated the quality of conduct and reporting of included rapid reviews using the A Measurement Tool to Assess Systematic Reviews (AMSTAR) and Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklists. Compliance with each checklist item was examined, and the sum of adequately reported items was used to describe overall compliance. Rapid reviews were stratified to explore differences in compliance related to publication status. The association between compliance and time to completion or length of publication was explored through univariate regression.
Results
Sixty-six rapid reviews were included. Nomenclature, research questions and approaches to rapid reviews were heterogeneous. Compliance with the AMSTAR and PRISMA checklists was poor. Published rapid reviews were compliant with individual PRISMA items more often than unpublished reviews, but no overall difference was seen in AMSTAR item compliance. There was evidence of an association between length of publication and time to completion and the number of adequately reported PRISMA or AMSTAR items.
Conclusions
Transparency and inadequate reporting are significant limitations of rapid reviews. Scientific editors, authors and producing agencies should ensure that the reporting of conduct and findings is accurate and complete. Further research may be warranted to explore reporting and conduct guidelines specific to rapid reviews and how these guidelines may be applied across the spectrum of rapid review approaches.
Electronic supplementary material
The online version of this article (doi:10.1186/s13643-016-0258-9) contains supplementary material, which is available to authorized users.
Highlights
Introduction: Rapid reviews are an accelerated evidence synthesis approach intended to meet the timely needs of decision-makers in healthcare settings.
Selection of rapid review samples: Fourteen health technology assessment (HTA) agencies responded to the INAHTA scan, and their external web sites were searched for relevant rapid review products following the search for published and unpublished reports.
Standardized reporting and conduct checklists such as A Measurement Tool to Assess Systematic Reviews (AMSTAR) and PRISMA provide a useful way to compare and contrast rapid reviews across a number of key domains. This assessment of 66 rapid reviews shows that conduct and reporting are often inadequate and unclear.
Summary
Rapid reviews are an accelerated evidence synthesis approach intended to meet the timely needs of decision-makers in healthcare settings [1, 2]. The time needed to complete a full systematic review of the literature often exceeds the time that end-users have to evaluate evidence or incorporate it into their processes. Heterogeneity of rapid review approaches and poor reporting of methods or processes have been consistently observed, making evaluation of these evidence products difficult [6]. This, in turn, makes it difficult for decision-makers to quantify any bias that may have been introduced or to judge how much value to place on the evidence contained in a rapid review.