Abstract

Computer simulation was applied to Sartwell's model to examine the impact of competing risks of death on the underlying assumptions and the power to reject both uniform and normal incubation period distributions. Exponential and nonparametric survival functions were imposed onto lognormal, uniform, and normal distributions to create random samples reflecting competing risks. These random samples were evaluated with the Shapiro-Wilk test to determine the proportion for which the lognormal distribution was rejected. The simulations indicated that competing causes of death do not significantly alter the lognormal distribution of incubation periods. In only approximately 5% of the samples drawn from a lognormal distribution was the lognormal hypothesis rejected with a goodness-of-fit test when sample size varied from 20 to 500. There was generally good power (> 80%) to reject a lognormal distribution if the random samples were generated from a uniform distribution of incubation times, but not when they were generated from a normal distribution, particularly with increasing ages at disease onset. Varying the standard deviation did not significantly change the simulation results if the random samples came from a lognormal or uniform distribution. These conclusions were further supported by application of Sartwell's model to published data on the ages of onset for several chronic diseases.
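
As a rough illustration of the procedure described above, the following Python sketch imposes an exponential competing risk of death on lognormally distributed incubation times and estimates the proportion of samples for which the Shapiro-Wilk test rejects lognormality. All parameter values, function names, and the use of a simple exponential death hazard are illustrative assumptions, not the settings reported in the paper.

```python
# Hypothetical sketch of the competing-risks simulation (parameters assumed).
import numpy as np
from scipy.stats import shapiro

rng = np.random.default_rng(0)

def rejection_proportion(n_samples=1000, sample_size=100,
                         meanlog=2.0, sdlog=0.5, death_rate=0.01):
    """Estimate how often the lognormal hypothesis is rejected when an
    exponential competing risk of death is imposed on lognormal onsets."""
    rejections = 0
    for _ in range(n_samples):
        kept = np.empty(0)
        # Keep drawing until a full sample of observed onsets is accumulated.
        while kept.size < sample_size:
            onset = rng.lognormal(mean=meanlog, sigma=sdlog, size=sample_size)
            death = rng.exponential(scale=1.0 / death_rate, size=sample_size)
            # Only subjects whose disease onset precedes death are observed.
            kept = np.concatenate([kept, onset[onset < death]])
        observed = kept[:sample_size]
        # Shapiro-Wilk on log(times): rejecting normality of the logs
        # corresponds to rejecting lognormality of the incubation periods.
        _, p = shapiro(np.log(observed))
        if p < 0.05:
            rejections += 1
    return rejections / n_samples

if __name__ == "__main__":
    print(rejection_proportion())
```

Repeating this with samples generated from uniform or normal onset distributions, or with a nonparametric survival curve in place of the exponential hazard, would reproduce the power comparisons summarized in the abstract.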
