Abstract

We are concerned with detecting model changes in probability distributions. We specifically consider strategies based on the minimum description length (MDL) principle and theoretically analyze their basic performance from two aspects: data compression and hypothesis testing. From the viewpoint of data compression, we derive a new bound on the minimax regret for model changes, where the minimax regret is defined as the minimum of the worst-case code length relative to the least normalized maximum likelihood (NML) code length over all model changes. From the viewpoint of hypothesis testing, we reduce model change detection to a simple hypothesis testing problem and thereby derive upper bounds on the error probabilities of the MDL-based model change test. These bounds are valid for finite sample sizes and are related to the information-theoretic complexity as well as to the discrepancy measure of the hypotheses to be tested.
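For reference, the quantities the abstract refers to can be sketched with the standard MDL definitions; the notation below follows the textbook convention and is not necessarily the paper's exact formulation. For a parametric model class $\{p(\cdot;\theta)\}$ with maximum likelihood estimator $\hat\theta(\cdot)$, the NML code length of a sequence $x^n$ is

\[
L_{\mathrm{NML}}(x^n) \;=\; -\log p\bigl(x^n;\hat\theta(x^n)\bigr) \;+\; \log \sum_{y^n} p\bigl(y^n;\hat\theta(y^n)\bigr),
\]

and the minimax regret over model changes, as described in the abstract, then takes the form

\[
R^{\ast} \;=\; \min_{q}\,\max_{x^n}\Bigl[\,-\log q(x^n)\;-\;\min_{c} L_{\mathrm{NML}}(x^n \mid c)\,\Bigr],
\]

where $q$ ranges over code-length functions (probability distributions), $c$ ranges over the admissible model-change patterns, and $L_{\mathrm{NML}}(x^n \mid c)$ denotes the NML code length under change pattern $c$, so the inner minimum is the least NML code length over all model changes.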
