Abstract

We study the sequential change detection problem with incomplete knowledge of source statistics, using universal estimators of entropy and divergence rate. We describe a novel technique for reducing the time complexity of the JB-Page change detection test (Jacob and Bansal, 2008) and prove a lemma justifying the method. Inspired by the Page (1954) test, we propose a test to detect a change from a stationary Markov ψ-mixing process to a stationary ergodic process, where the statistics of both sources are unknown except for a training sequence from the source before the change. The test uses a universal estimator of the divergence rate between a stationary ergodic process and a stationary Markov ψ-mixing process, which we propose and prove to be almost surely convergent; the estimator is based on the fixed-database Lempel-Ziv (FDLZ) cross-parsing technique. The proof of convergence of our divergence-rate estimator relies on the almost-sure convergence of a match-length-like quantity between a stationary ergodic process and a stationary Markov ψ-mixing process, which we also establish here.
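
To make the flavour of the construction concrete, here is a minimal toy sketch, not the paper's estimator or test: a Page-type (CUSUM) statistic driven by the score log(n)/L, where L is an FDLZ-style match length of the current data window against a fixed training database of length n. All names (match_length, window_score, calibrate, page_test) and the window size, drift and threshold are hypothetical choices made for illustration only.

    import math
    import random
    from typing import Sequence

    def match_length(database: Sequence[int], window: Sequence[int]) -> int:
        """Longest l such that window[:l] occurs as a contiguous block in database."""
        n, best = len(database), 0
        for start in range(n):
            l = 0
            while l < len(window) and start + l < n and database[start + l] == window[l]:
                l += 1
            best = max(best, l)
        return best

    def window_score(database, window):
        """log(n)/L: a crude cross-entropy-like score; large when the window
        parses poorly against the pre-change training database."""
        return math.log(len(database)) / max(1, match_length(database, window))

    def calibrate(database, holdout, w):
        """Average score of held-out pre-change data, used as the CUSUM baseline."""
        scores = [window_score(database, holdout[t:t + w])
                  for t in range(0, len(holdout) - w + 1, w)]
        return sum(scores) / len(scores)

    def page_test(database, observations, baseline, w=12, drift=0.1, threshold=2.0):
        """Page-style test: accumulate positive excursions of (score - baseline - drift)
        over non-overlapping windows; declare a change once the sum exceeds threshold."""
        s = 0.0
        for t in range(0, len(observations) - w + 1, w):
            s = max(0.0, s + window_score(database, observations[t:t + w]) - baseline - drift)
            if s > threshold:
                return t      # start index of the window where the change is declared
        return None           # no change declared

    def draw(dist, k):
        """Draw k i.i.d. symbols from the alphabet {0,1,2,3} with the given weights."""
        return random.choices(range(4), weights=dist, k=k)

    if __name__ == "__main__":
        random.seed(1)
        pre, post = [0.55, 0.15, 0.15, 0.15], [0.25, 0.25, 0.25, 0.25]
        database = draw(pre, 4000)    # training sequence from the pre-change source
        holdout = draw(pre, 600)      # extra pre-change data, used only for calibration
        stream = draw(pre, 300) + draw(post, 300)   # change occurs at index 300
        print(page_test(database, stream, calibrate(database, holdout, 12)))

In this toy setting the pre-change source is biased and the post-change source is uniform, so the cross-parsing score typically rises after the change point and the cumulative statistic crosses the threshold some windows later; the paper's actual test and its FDLZ-based divergence-rate estimator are more refined than this sketch.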
