Ever since the pattern of localized extinction associated with measles was discovered by Bartlett in 1957, many models have been developed in an attempt to reproduce this phenomenon. Recently, the use of constant infectious and incubation periods, rather than the more convenient exponential forms, has been presented as a simple means of obtaining realistic persistence levels. However, this result appears at odds with rigorous mathematical theory; here we reconcile these differences. Using a deterministic approach, we parameterize a variety of models to fit the observed biennial attractor, so that the level of seasonality is determined by the choice of model. We can then fairly compare the persistence of the stochastic versions of these models, using the 'best-fit' parameters. Finally, we consider the differences between the observed fade-out pattern and the more theoretically appealing 'first passage time'.
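The central comparison, between memoryless (exponential) and fixed-duration infectious and latent periods in a stochastic, seasonally forced model, can be illustrated with a minimal simulation sketch. This is not the paper's fitted model: the population size, R0, period lengths, birth rate and forcing amplitude below are illustrative assumptions, and fade-outs are crudely proxied by the fraction of days with zero infectives.

```python
# Minimal sketch (assumed parameters, not the paper's fitted values):
# a daily-time-step stochastic SEIR with sinusoidal forcing, comparing
# geometric (the discrete analogue of exponential) stage durations
# against fixed-duration stages.
import numpy as np

rng = np.random.default_rng(1)

N = 100_000          # population size (assumed)
R0 = 17.0            # measles-like basic reproduction ratio (assumed)
latent_days = 8      # mean latent period (days)
infect_days = 5      # mean infectious period (days)
births_per_day = 12  # crude susceptible recruitment (assumed)
beta1 = 0.25         # seasonal forcing amplitude (assumed)
years = 20
days = 365 * years

def run(fixed_periods: bool) -> float:
    """Simulate once; return the fraction of days with zero infectives."""
    beta0 = R0 / infect_days
    S = int(0.05 * N)
    if fixed_periods:
        # Per-day cohorts: individuals progress deterministically.
        E = np.zeros(latent_days, dtype=int)
        I = np.zeros(infect_days, dtype=int)
        I[0] = 20
    else:
        # Single counters: memoryless (geometric) stage durations.
        E, I = 0, 20
    extinct_days = 0
    for t in range(days):
        beta = beta0 * (1.0 + beta1 * np.cos(2 * np.pi * t / 365.0))
        n_inf = int(I.sum()) if fixed_periods else I
        if n_inf == 0:
            extinct_days += 1
        # New infections: binomial draw among susceptibles.
        p_inf = 1.0 - np.exp(-beta * n_inf / N)
        new_E = rng.binomial(S, p_inf)
        S += births_per_day - new_E  # births replenish susceptibles
        if fixed_periods:
            # Shift cohorts by one day; the oldest infectious cohort recovers.
            new_I = E[-1]
            E[1:] = E[:-1]
            E[0] = new_E
            I[1:] = I[:-1]
            I[0] = new_I
        else:
            # Memoryless progression and recovery.
            new_I = rng.binomial(E, 1.0 / latent_days)
            recov = rng.binomial(I, 1.0 / infect_days)
            E += new_E - new_I
            I += new_I - recov
    return extinct_days / days

for label, fixed in [("exponential (memoryless)", False),
                     ("fixed duration", True)]:
    frac = np.mean([run(fixed) for _ in range(5)])
    print(f"{label:25s}: fraction of days with zero infectives = {frac:.3f}")
```

Under these assumed parameters the population sits below the measles critical community size, so both variants show fade-outs; the point of the sketch is only that the distribution of stage durations changes how often the infective count hits zero, the quantity the abstract's persistence comparison turns on.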