Abstract
Systematic reviews (SRs) form an important part of National Institute for Health and Care Excellence (NICE) single technology appraisal (STA) manufacturer submissions. To minimise publication bias when conducting SRs, supplementary searches should be conducted, and these should follow the same principles of transparency and reproducibility as database searches. This study aimed to evaluate the supplementary search methods used in NICE STA manufacturer submissions. NICE STAs published between 2011 and 2015 were reviewed. Supplementary search details from manufacturer submissions and the related critique from the corresponding evidence review group (ERG) reports were extracted. Searches were deemed reproducible if the minimum amount of information required to reproduce them was reported. Of 126 STAs identified, 80 were excluded: appraisal reviews/updates (n=20); appraisal terminated (n=12); no full submission available (n=9); appendices (containing search methods) not published online (n=39). Of the 46 included manufacturer submissions, 28 reported conference proceedings searches, of which 24 provided enough information for the searches to be reproduced. Twenty-one reported clinical trials registry searches, but only seven provided enough information to reproduce these. Thirty-six reported conducting other manual searches, including: manufacturer internal databases (n=24); reference lists (n=20); regulatory body websites (n=11); other websites (n=5); internal experts (n=2). ERGs critiqued the omission of supplementary searches in 8 of the 18 submissions that lacked searches of conference proceedings, and in 8 of the 25 submissions that did not report searching clinical trials registries. Evaluation methods differed between ERGs. Principles of transparency and reproducibility were not followed in the majority of manufacturer submissions in which supplementary searches were conducted. However, the results of this study are limited by the low number of appendices published online. Supplementary search methods used in manufacturer submissions should be reported in full, and ERGs should be consistent in their critique of supplementary search methodology to ensure that no evidence is omitted from decision making.