Abstract
Objectives: To examine the consistency (interrater reliability) of applying guidance for grading the strength of evidence in systematic reviews for the Agency for Healthcare Research and Quality Evidence-based Practice Center program.

Study Design and Setting: Using data from two systematic reviews, the authors tested the main components of the approach: (1) scoring evidence on the four required domains (risk of bias, consistency, directness, and precision) separately for randomized controlled trials (RCTs) and observational studies, and (2) developing an overall strength of evidence grade given the scores for each of these domains.

Results: Conclusions about overall strength of evidence reached by experienced systematic reviewers on the basis of the same evidence can differ greatly, especially for complex bodies of evidence. Current instructions may be sufficient for straightforward quantitative evaluations that use meta-analysis to summarize RCT findings. In contrast, agreement suffered when evaluations did not lend themselves to meta-analysis and reviewers needed to rely on their own qualitative judgment. Three areas raised particular concern: (1) evidence from a combination of RCTs and observational studies, (2) outcomes with differing measurement, and (3) evidence that appeared to show no differences in outcomes.

Conclusion: Interrater reliability was highly variable, both for scoring the strength of evidence domains and for combining scores to reach overall strength of evidence grades. Future research can help establish improved methods for evaluating these complex bodies of evidence.
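As an illustration only (not drawn from the article), the following minimal Python sketch computes Cohen's kappa, a common chance-corrected statistic for interrater agreement, for two hypothetical reviewers assigning overall strength of evidence grades to the same bodies of evidence. The grade labels, reviewer data, and choice of statistic here are assumptions for demonstration; the authors' actual reliability analysis may differ.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b) and rater_a, "need paired, non-empty ratings"
    n = len(rater_a)
    # Observed agreement: proportion of items both raters graded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's marginal grade frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[g] * freq_b[g] for g in set(rater_a) | set(rater_b)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical overall strength of evidence grades from two reviewers
# for eight bodies of evidence (illustrative data only).
reviewer_1 = ["high", "moderate", "low", "insufficient",
              "moderate", "low", "high", "low"]
reviewer_2 = ["high", "low", "low", "insufficient",
              "moderate", "moderate", "high", "low"]

print(f"Cohen's kappa = {cohen_kappa(reviewer_1, reviewer_2):.2f}")
```

With these toy data, observed agreement is 0.75 but kappa falls to about 0.65 once chance agreement is discounted, which is the kind of gap that motivates reporting chance-corrected reliability rather than raw agreement.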