Abstract

Objective: To assess the methodological and reporting quality of Chinese- and English-language systematic reviews and meta-analyses (SRs/MAs) published by Chinese authors between 2016 and 2018.

Study Design and Setting: We searched MEDLINE and the Chinese Science Citation Database (CSCD) for SRs/MAs led by Chinese authors and published between 2016 and 2018. We used random sampling to select 10% of the eligible SRs/MAs published in each year from CSCD, and then matched the same number of SRs/MAs in MEDLINE. Reporting quality was evaluated with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist and methodological quality with the Assessment of Multiple Systematic Reviews (AMSTAR-2) tool. Stratified analyses were conducted to compare differences in quality between Chinese- and English-language SRs/MAs.

Results: We identified 336 SRs/MAs (168 in Chinese and 168 in English). The reporting quality of Chinese-language SRs/MAs was slightly lower than that of English-language SRs/MAs (mean PRISMA scores: 20.58 vs. 21.71 in 2016, 19.87 vs. 21.24 in 2017, and 21.29 vs. 22.38 in 2018). Fewer than half of both Chinese- and English-language SRs/MAs complied with item 5 (protocol and registration), item 7 (information sources), item 8 (search), and item 27 (funding). The methodological quality of Chinese-language SRs/MAs was also slightly lower than that of English-language SRs/MAs (mean AMSTAR-2 scores: 8.07 vs. 9.36 in 2016, 9.21 vs. 10.26 in 2017, and 8.86 vs. 9.28 in 2018). Three items (item 2: establish a protocol; item 4: use a comprehensive literature search; item 10: report the sources of funding) were adhered to by fewer than 10% of both Chinese- and English-language SRs/MAs. Only one (0.6%) Chinese-language SR/MA and nine (5.4%) English-language SRs/MAs were rated as high methodological quality.

Conclusion: The reporting and methodological quality of English-language SRs/MAs conducted by authors from China between 2016 and 2018 were slightly better than those of Chinese-language SRs/MAs.
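The sampling and comparison steps described above (year-stratified 10% random sampling from CSCD, followed by per-year comparison of mean PRISMA and AMSTAR-2 scores) can be illustrated with the minimal Python sketch below. The record fields, function names, and seed are hypothetical placeholders, not the authors' actual analysis code.

```python
# Illustrative sketch only: the study sampled bibliographic records from CSCD and
# matched the same number in MEDLINE; the data structures here are assumed.
import random
from statistics import mean

def sample_ten_percent_per_year(records, years=(2016, 2017, 2018), seed=42):
    """Randomly select 10% of eligible records for each publication year."""
    rng = random.Random(seed)
    sampled = []
    for year in years:
        pool = [r for r in records if r["year"] == year]
        k = max(1, round(0.10 * len(pool)))
        sampled.extend(rng.sample(pool, k))
    return sampled

def mean_scores_by_year(scored_reviews, score_key):
    """Mean PRISMA or AMSTAR-2 score per year, mirroring the stratified comparison."""
    by_year = {}
    for r in scored_reviews:
        by_year.setdefault(r["year"], []).append(r[score_key])
    return {year: round(mean(scores), 2) for year, scores in sorted(by_year.items())}

# Hypothetical usage, assuming each record carries "year" and "prisma_score" fields:
# chinese_sample = sample_ten_percent_per_year(cscd_records)
# print(mean_scores_by_year(chinese_sample, "prisma_score"))
```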
