Abstract

Background

The quality of harms reporting in journal publications is often poor, which can impede the risk-benefit interpretation of a clinical trial. Clinical study reports can provide more reliable, complete, and informative data on harms than the corresponding journal publication. This case study compares the quality and quantity of harms data reported in journal publications and clinical study reports of orlistat trials.

Methods

Publications related to clinical trials of orlistat were identified through comprehensive literature searches. A request was made to Roche (Genentech; South San Francisco, CA, USA) for clinical study reports related to the orlistat trials identified in our search. We compared adverse events, serious adverse events, and the reporting of 15 harms criteria in both document types, and we compared meta-analytic results based on data from the clinical study reports against those based on the journal publications.

Results

Five journal publications with matching clinical study reports were available for five independent clinical trials. The journal publications did not always report the complete list of identified adverse events and serious adverse events. We found some differences in the magnitude of the pooled risk difference between the two document types, with a statistically significant risk difference for three adverse events and two serious adverse events when data reported in the clinical study reports were used; these events were of mild intensity and unrelated to orlistat. The CONSORT harms reporting criteria were often satisfied in the methods section of the clinical study reports (70–90 % of the methods section criteria satisfied in the clinical study reports compared with 10–50 % in the journal publications), while both document types satisfied 80–100 % of the results section criteria, albeit with greater detail provided in the clinical study reports.

Conclusions

In this case study, journal publications provided insufficient information on the harms outcomes of clinical trials and did not specify that only a subset of harms data was being presented. Clinical study reports often present data on harms, including serious adverse events, that are not reported or mentioned in the journal publications. Clinical study reports could therefore support a more complete, accurate, and reliable investigation, and researchers undertaking evidence synthesis of harms outcomes should not rely only on the incomplete data presented in journal publications.

Electronic supplementary material

The online version of this article (doi:10.1186/s13063-016-1327-z) contains supplementary material, which is available to authorized users.
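The meta-analytic comparison above rests on pooled risk differences for individual adverse events. As a minimal, hedged illustration of how such a pooled risk difference can be computed (this is not the authors' analysis code, and the counts below are illustrative placeholders rather than orlistat trial data), a fixed-effect inverse-variance pooling in Python might look like the following:

```python
# Minimal sketch (not the authors' actual analysis): pooling a risk
# difference across independent trials with a fixed-effect
# inverse-variance model. Counts are illustrative placeholders.
import math

# Each tuple: (events_treatment, n_treatment, events_control, n_control)
trials = [
    (12, 200, 5, 198),   # hypothetical trial 1
    (30, 340, 18, 335),  # hypothetical trial 2
    (7, 120, 6, 118),    # hypothetical trial 3
]

weights, weighted_rds = [], []
for e_t, n_t, e_c, n_c in trials:
    p_t, p_c = e_t / n_t, e_c / n_c
    rd = p_t - p_c                                        # per-trial risk difference
    var = p_t * (1 - p_t) / n_t + p_c * (1 - p_c) / n_c   # variance of that risk difference
    w = 1 / var                                           # inverse-variance weight
    weights.append(w)
    weighted_rds.append(w * rd)

pooled_rd = sum(weighted_rds) / sum(weights)
se = math.sqrt(1 / sum(weights))
ci_low, ci_high = pooled_rd - 1.96 * se, pooled_rd + 1.96 * se

print(f"Pooled risk difference: {pooled_rd:.3f} "
      f"(95% CI {ci_low:.3f} to {ci_high:.3f})")
```

In practice, meta-analyses of harms often use Mantel–Haenszel or random-effects models and handle zero-event trials explicitly; the sketch only shows the basic arithmetic of weighting per-trial risk differences by their inverse variance.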

Highlights

  • The quality of harms reporting in journal publications is often poor, which can impede the risk-benefit interpretation of a clinical trial

  • When both document types reported on any particular criterion (i.e. BOTH), the reported information was compared and classified as follows (a minimal illustrative sketch follows this list):
    - CSR (+): the clinical study report (CSR) provides more information than the journal publication
    - Similar (O): both document types provide equal and similar information
    - CSR (-): the journal publication provides more information than the CSR

  • This case study confirms that CSRs can provide more complete and robust information on harms data collected in clinical trials, compared with publicly available journal publications
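As a small illustrative sketch (assumed for clarity, not taken from the paper), the per-criterion comparison rule described in the highlights above could be expressed as follows, where the detail scores and criterion names are hypothetical:

```python
# Hypothetical sketch of the per-criterion comparison rule: for each harms
# criterion reported in both document types, label which source gave more
# information. Scores and criteria are illustrative, not from the study.
def classify(csr_detail: int, publication_detail: int) -> str:
    """Return 'CSR (+)', 'Similar (O)', or 'CSR (-)' for one criterion."""
    if csr_detail > publication_detail:
        return "CSR (+)"    # clinical study report gives more information
    if csr_detail < publication_detail:
        return "CSR (-)"    # journal publication gives more information
    return "Similar (O)"    # both give equal and similar information

# Example: three criteria reported in both document types
comparisons = {"AE definitions": (3, 1), "SAE listing": (2, 2), "Withdrawals": (1, 2)}
for criterion, (csr, pub) in comparisons.items():
    print(criterion, "->", classify(csr, pub))
```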


Summary

Introduction

The quality of harms reporting in journal publications is often poor, which can impede the risk-benefit interpretation of a clinical trial. Clinical study reports (CSRs) can provide more reliable, complete, and informative data on harms than the corresponding journal publication. Previous work has found harms reporting in journal publications to be inadequate and inconsistent [5], and although clinical trial registries have made major strides in improving the transparency of trial data, a recent study suggested that results from trial registries often remain unavailable [6]. CSRs are 'integrated' full reports, which can be up to a thousand pages in length and include extensive, detailed information on the efficacy and harms of interventions. The harms information in these documents is usually presented separately for each adverse event (AE) and serious adverse event (SAE) term in summary tables and listings.

