Abstract

Background

Guidelines for assessing the methodological and reporting quality of systematic reviews (SRs) were developed to support the implementation of evidence-based health care and the reduction of research waste. As SRs assessing cohorts of SRs become more prevalent in the literature, and with the increased uptake of SR evidence for decision-making, the methodological quality and reporting standards of SRs are of interest. The objective of this study was to evaluate SR adherence to the Quality of Reporting of Meta-analyses (QUOROM) and PRISMA reporting guidelines and to the quality assessment tools A Measurement Tool to Assess Systematic Reviews (AMSTAR) and the Overview Quality Assessment Questionnaire (OQAQ), as evaluated in methodological overviews.

Methods

The Cochrane Library, MEDLINE®, and EMBASE® databases were searched from January 1990 to October 2014. Title and abstract screening and full-text screening were conducted independently by two reviewers. Reports assessing the quality or reporting of a cohort of SRs of interventions using PRISMA, QUOROM, OQAQ, or AMSTAR were included. All results are reported as frequencies and percentages of reports and SRs, respectively.

Results

Of the 20,765 independent records retrieved from electronic searching, 1189 reports were reviewed for eligibility at full text, of which 56 reports (5371 SRs in total) evaluating the PRISMA, QUOROM, AMSTAR, and/or OQAQ tools were included. Notable items include the following: of the SRs assessed using PRISMA, over 85% (1532/1741) provided a rationale for the review, and less than 6% (102/1741) provided protocol information. For reports using QUOROM, only 9% (40/449) of SRs provided a trial flow diagram; however, 90% (402/449) described the explicit clinical problem and review rationale in the introduction section. Of reports using AMSTAR, 30% (534/1794) used duplicate study selection and data extraction; conversely, 80% (1439/1794) of SRs provided characteristics of the included studies. In terms of OQAQ, 37% (499/1367) of the SRs assessed risk of bias (validity) in the included studies, while 80% (1112/1387) reported the criteria for study selection.

Conclusions

Although reporting guidelines and quality assessment tools exist, the reporting and methodological quality of SRs remain inconsistent. Mechanisms to improve adherence to established reporting guidelines and methodological assessment tools are needed to improve the quality of SRs.

Highlights

  • Guidelines for assessing methodological and reporting quality of systematic reviews (SRs) were developed to contribute to implementing evidence-based health care and the reduction of research waste

  • Of the 20,765 independent records retrieved from electronic searching, 1189 reports were reviewed for eligibility at full text, of which 935 were excluded either for not assessing a cohort of SRs or because the primary intent was not to assess methodological quality (MQ) or reporting quality (RQ)

  • The majority of reports were conducted with the intent to assess MQ or RQ using an appropriate tool; 61% (34/56) of reports had a primary intent to assess MQ only, 7% (4/56) reported having a primary intent to assess RQ, and 27% (15/56) had a primary intent to assess both MQ and RQ


Introduction

Systematic reviews (SRs) are considered the gold standard of evidence for evaluating the benefits and harms of healthcare interventions. They are powerful tools for assessing treatment effectiveness and can thereby improve patient care [1]. Poorly conducted or poorly reported SRs may be associated with bias, limiting their usefulness [5]. When SRs comply with established methodology, report findings transparently, and are free of bias, they provide relevant information for practice guideline developers and other stakeholders such as policy makers [5].

