Abstract

Purpose: The purpose of this study was to illustrate how publication of interim analyses of randomized clinical trials (RCTs) can cause problems in the interpretation of final results.

Method: The effect of publishing interim analyses on the results of a typical HIV RCT comparing regimens of registered antiretroviral drugs was illustrated using a simulation study. Simulations modeled an RCT comparing the effect of two treatment combinations on changes in log HIV viral load from baseline. Publication of interim results at 6 months was assumed to lead to 50% of patients switching from the poorer treatment if interim results were statistically significant (p < .05), 20% of patients switching from the poorer treatment if interim results were marginally significant (.05 < p < .20), and 10% of all patients switching treatment if interim results were not statistically significant. Three scenarios were simulated: a large treatment difference (0.4 log HIV viral load), a moderate difference (0.2 log), and no treatment difference (0.0 log).

Results: The simulation study showed that if the true treatment difference was large (0.4 log), the power of the trial was reduced from over 80% at 6 months to under 37% at 12 months. Furthermore, given statistically significant interim results at 6 months, the simulations illustrated that the trial results would appear similar at 12 months regardless of the true underlying treatment difference.

Conclusion: The simulations reinforce the fact that publication of interim analyses of RCTs can affect the future conduct of a trial and make interpretation of final results difficult.
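To make the switching mechanism concrete, the following is a minimal simulation sketch in Python of the kind of trial described above. The per-arm sample size, the standard deviation of the change in log viral load, and the assumption that switchers follow the other regimen's mean outcome after 6 months are illustrative choices not stated in the abstract; this is not the authors' actual simulation code.

```python
# Minimal sketch of an RCT simulation with publication-driven treatment switching.
# Assumed (not from the abstract): n per arm, outcome SD, and how switchers'
# 12-month outcomes are modelled.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

N_PER_ARM = 150   # assumed sample size per arm
SD = 1.0          # assumed SD of change in log viral load
N_SIMS = 5000

def simulate_trial(true_diff, n=N_PER_ARM, sd=SD):
    """One trial: interim analysis at 6 months, switching, final ITT test at 12 months."""
    # 6-month change from baseline; arm A is the (truly) better regimen,
    # so a larger drop in log viral load means a better response.
    a6 = rng.normal(-true_diff, sd, n)
    b6 = rng.normal(0.0, sd, n)

    # Interim analysis at 6 months
    p_interim = stats.ttest_ind(a6, b6).pvalue

    # Switching fractions triggered by publication of the interim results
    if p_interim < 0.05:
        frac_poorer, frac_other = 0.50, 0.0    # 50% leave the poorer arm
    elif p_interim < 0.20:
        frac_poorer, frac_other = 0.20, 0.0    # 20% leave the poorer arm
    else:
        frac_poorer, frac_other = 0.10, 0.10   # 10% of all patients switch

    # The arm with the smaller observed drop at 6 months looks poorer
    if a6.mean() <= b6.mean():
        p_switch_a, p_switch_b = frac_other, frac_poorer
    else:
        p_switch_a, p_switch_b = frac_poorer, frac_other

    switch_a = rng.random(n) < p_switch_a
    switch_b = rng.random(n) < p_switch_b

    # 12-month change from baseline: switchers follow the other arm's mean
    # from 6 months onward (a simplifying assumption)
    a12 = rng.normal(np.where(switch_a, 0.0, -true_diff), sd)
    b12 = rng.normal(np.where(switch_b, -true_diff, 0.0), sd)

    # Final intention-to-treat analysis at 12 months
    return stats.ttest_ind(a12, b12).pvalue

for true_diff in (0.4, 0.2, 0.0):
    rejections = [simulate_trial(true_diff) < 0.05 for _ in range(N_SIMS)]
    print(f"true difference {true_diff:.1f} log: 12-month rejection rate = {np.mean(rejections):.2f}")
```

Under these assumptions, switching dilutes the between-arm contrast in the intention-to-treat analysis, which is the mechanism by which the abstract's reported drop in power (from over 80% to under 37% for the 0.4 log scenario) arises.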
