Abstract

In this study, we address the problem of tracking changes in the statistical model that generates data, under the assumption that the data-generating model may change over time. This problem is of great importance for learning from non-stationary data. One promising approach is the dynamic model selection (DMS) method, in which a model sequence is estimated on the basis of the minimum description length (MDL) principle. Another approach is the infinite hidden Markov model (HMM), a non-parametric learning method that allows an unbounded number of states. In this study, we propose several new variants of DMS together with efficient algorithms that minimize the total code-length using the sequential normalized maximum likelihood (SNML). We compare these algorithms with the infinite HMM in terms of statistical model change detection performance, and we empirically demonstrate that one of our DMS variants significantly outperforms the infinite HMM in change-point detection accuracy.
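As a rough illustration of the kind of computation involved, the sketch below evaluates the sequential normalized maximum likelihood (SNML) code length for a simple Bernoulli model and uses the accumulated code length to score a single candidate change point. The Bernoulli model, the constant change penalty, and the single-change scoring rule are illustrative assumptions chosen for exposition; they are not the authors' DMS algorithms.

```python
import math


def bernoulli_max_loglik(k, n):
    """Log of the maximized Bernoulli likelihood for k ones out of n symbols (0^0 := 1)."""
    if n == 0:
        return 0.0
    ll = 0.0
    if k > 0:
        ll += k * math.log(k / n)
    if n - k > 0:
        ll += (n - k) * math.log((n - k) / n)
    return ll


def snml_code_length(x, k, n):
    """SNML code length (in nats) of the next symbol x, given k ones among n past symbols."""
    # Numerator: maximized likelihood of the past sequence extended by the observed symbol.
    num = bernoulli_max_loglik(k + x, n + 1)
    # Normalizer: sum of maximized likelihoods over both possible next symbols.
    denom = math.log(math.exp(bernoulli_max_loglik(k + 1, n + 1))
                     + math.exp(bernoulli_max_loglik(k, n + 1)))
    return denom - num


def total_snml_code_length(xs):
    """Accumulated SNML code length of a binary sequence under a single (no-change) model."""
    k, n, total = 0, 0, 0.0
    for x in xs:
        total += snml_code_length(x, k, n)
        k += x
        n += 1
    return total


def change_score(xs, t, change_penalty=2.0):
    """Code-length saving from restarting the model at t; positive values favour a change.

    change_penalty is an assumed constant standing in for the code length
    needed to encode the occurrence of a model change.
    """
    no_change = total_snml_code_length(xs)
    with_change = (total_snml_code_length(xs[:t])
                   + total_snml_code_length(xs[t:])
                   + change_penalty)
    return no_change - with_change


if __name__ == "__main__":
    data = [0] * 30 + [1] * 30  # the generating distribution shifts at t = 30
    best_t = max(range(1, len(data)), key=lambda t: change_score(data, t))
    print(best_t, change_score(data, best_t))
```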
