Abstract
We consider a sequential Bayesian changepoint detection problem for a general stochastic model, assuming that the observed data may be dependent and non-identically distributed and that the prior distribution of the change point is arbitrary, not necessarily geometric. Tartakovsky and Veeravalli (2005) developed a general asymptotic theory of changepoint detection for non-identically distributed, dependent observations in discrete time, and Baron and Tartakovsky (2006) did so in continuous time, assuming a certain stability of the log-likelihood ratio process. This stability property was formulated in terms of the $r$-quick convergence of the normalized log-likelihood ratio process to a positive, finite number, which can be interpreted as the limiting Kullback–Leibler information between the “change” and “no change” hypotheses. In those papers, it was conjectured that $r$-quick convergence can be relaxed to $r$-complete convergence, which is typically much easier to verify in particular examples. In the present paper, we justify this conjecture by showing that the Shiryaev change detection procedure is nearly optimal: it asymptotically minimizes, to first order as the probability of false alarm vanishes, the moments of the detection delay up to order $r$ whenever $r$-complete convergence holds. We also study asymptotic properties of the Shiryaev–Roberts detection procedure in the Bayesian context.
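To make the two procedures named in the abstract concrete, here is a minimal sketch of their standard recursions in the simplest i.i.d. setting: a Gaussian mean shift from $\mu_0$ to $\mu_1$, a geometric($p$) prior for the Shiryaev statistic, and a stopping threshold $A$. All numerical parameters (`mu0`, `mu1`, `p`, the threshold) are illustrative assumptions, not values from the paper, which treats a far more general dependent, non-i.i.d. model.

```python
import math

def likelihood_ratio(x, mu0=0.0, mu1=1.0, sigma=1.0):
    # One-observation likelihood ratio of N(mu1, sigma^2) vs N(mu0, sigma^2)
    return math.exp((x * (mu1 - mu0) - 0.5 * (mu1**2 - mu0**2)) / sigma**2)

def shiryaev_stop(xs, threshold, p=0.01):
    # Shiryaev statistic with a geometric(p) prior on the change point:
    #   R_n = (1 + R_{n-1}) * LR_n / (1 - p),  R_0 = 0.
    # Stop (declare a change) at the first n with R_n >= threshold.
    r = 0.0
    for n, x in enumerate(xs, start=1):
        r = (1.0 + r) * likelihood_ratio(x) / (1.0 - p)
        if r >= threshold:
            return n
    return None  # no alarm within the sample

def shiryaev_roberts_stop(xs, threshold):
    # Shiryaev-Roberts statistic (the p -> 0 limit of the Shiryaev recursion):
    #   R_n = (1 + R_{n-1}) * LR_n,  R_0 = 0.
    r = 0.0
    for n, x in enumerate(xs, start=1):
        r = (1.0 + r) * likelihood_ratio(x)
        if r >= threshold:
            return n
    return None
```

With post-change data (observations near `mu1 = 1` or larger) the statistic grows geometrically and crosses the threshold within a few steps, while under pre-change data (observations near `mu0 = 0`) it stays bounded; the threshold controls the trade-off between false alarm probability and detection delay.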