Abstract
We study the sequential change detection problem, with incomplete knowledge of source statistics, through the use of universal estimators of entropy and divergence rate. A novel technique to reduce the time complexity of the JB-Page change detection test (Jacob-Bansal, 2008) is described, and a lemma justifying the method is proved. Inspired by the Page (1954) test, we propose a test to detect a change from a stationary Markov ψ-mixing process to a stationary ergodic process. The statistics of both sources are unknown, except for a training sequence from the source before the change. The test uses a universal estimator of the divergence rate between a stationary ergodic process and a stationary Markov ψ-mixing process, which we propose and prove to be almost surely convergent. It is based on the Fixed-Database Lempel-Ziv (FDLZ) cross-parsing technique. The proof of convergence of our divergence rate estimator uses the almost-sure convergence of a match-length-like quantity between a stationary ergodic process and a stationary Markov ψ-mixing process, which we also establish here.
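The abstract only names the FDLZ cross-parsing idea; as a rough illustration of how a match-length-based divergence rate estimator of this flavour can be computed, consider the minimal Python sketch below. It follows the general Wyner-Ziv/Ziv-Merhav style of match-length estimation against a fixed pre-change database, not the exact construction whose almost-sure convergence is proved in the paper, and the function names (match_length, cross_entropy_rate, divergence_rate) are illustrative placeholders.

```python
import numpy as np

def match_length(x: str, start: int, db: str) -> int:
    """Length of the longest prefix of x[start:] that occurs as a
    contiguous substring of the fixed database db (FDLZ-style match)."""
    length = 0
    while start + length < len(x) and x[start:start + length + 1] in db:
        length += 1
    return length

def cross_entropy_rate(x: str, db: str) -> float:
    """Match-length estimate (bits/symbol) of the cross entropy rate of
    the process generating x with respect to the process generating db."""
    lengths = [match_length(x, i, db) for i in range(len(x))]
    return np.log2(len(db)) / max(np.mean(lengths), 1e-9)

def entropy_rate(x: str) -> float:
    """Entropy rate of x estimated by self-matching: the second half of x
    is cross-parsed against its own first half."""
    half = len(x) // 2
    return cross_entropy_rate(x[half:], x[:half])

def divergence_rate(x: str, train: str) -> float:
    """Plug-in divergence rate estimate: cross entropy of x with respect
    to the pre-change training database minus the entropy rate of x."""
    return max(cross_entropy_rate(x, train) - entropy_rate(x), 0.0)

# Illustrative Page-style use: a large estimated divergence of the observed
# stream from the pre-change training database is evidence of a change.
rng = np.random.default_rng(0)
train = "".join(rng.choice(list("01"), p=[0.9, 0.1], size=4000))
stream = "".join(rng.choice(list("01"), p=[0.5, 0.5], size=400))
print(divergence_rate(stream, train))  # well above zero here
```

A sequential test in the spirit of Page (1954) would recompute such a statistic as new samples arrive and declare a change once it crosses a threshold chosen for the desired false-alarm rate; the quadratic-time substring matching above is where the paper's complexity-reduction technique would matter in practice.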