Background
Systematic reviews are important for decision makers. They offer many potential benefits, but they are often written in technical language, are too long, and lack the contextual details needed for decision-making, which makes them hard to use. Many organizations develop and disseminate derivative products from systematic reviews, such as evidence summaries, for different populations or subsets of decision makers. This systematic review aimed to (1) assess the effectiveness of evidence summaries on policymakers’ use of the evidence and (2) identify the summary components most effective at increasing policymakers’ use of the evidence. We present an overview of the available evidence on systematic review derivative products.

Methods
We included studies of policymakers at all levels as well as health system managers. We included studies examining any type of “evidence summary,” “policy brief,” or other product derived from systematic reviews that presented evidence in a summarized form. The primary outcomes were (1) use of systematic review summaries in decision-making (e.g., self-reported use of the evidence in policymaking and decision-making) and (2) policymakers’ understanding, knowledge, and/or beliefs (e.g., changes in knowledge scores about the topic included in the summary). We also assessed the perceived relevance, credibility, usefulness, understandability, and desirability (e.g., format) of the summaries.

Results
Our database search combined with our gray literature search yielded 10,113 references after removal of duplicates. Of these, 54 were reviewed in full text, and we included six studies (reported in seven papers) as well as protocols from two ongoing studies. Two studies assessed the use of evidence summaries in decision-making and found little to no difference in effect. There was also little to no difference in effect for knowledge, understanding, or beliefs (four studies) and for perceived usefulness or usability (three studies). Summary of findings tables and graded entry summaries were perceived as slightly easier to understand than complete systematic reviews. Two studies assessed formatting changes and found that, for summary of findings tables, certain elements, such as reporting study event rates and absolute differences, were preferred, as was avoiding the use of footnotes.

Conclusions
Evidence summaries are likely easier to understand than complete systematic reviews. However, their ability to increase the use of systematic review evidence in policymaking is unclear.

Trial registration
The protocol was published in the journal Systematic Reviews (2015;4:122).

Electronic supplementary material
The online version of this article (doi:10.1186/s13012-016-0530-3) contains supplementary material, which is available to authorized users.