Abstract

Modern multiscale segmentation methods are known to detect multiple change-points with high statistical accuracy while allowing for fast computation. The underpinning theory has been developed mainly for models that assume the signal is a piecewise constant function. In this paper we extend it to certain function classes beyond such step functions in a nonparametric regression setting, revealing certain multiscale segmentation methods as robust to deviations from piecewise constant functions. Our main finding is adaptation over such function classes for a universal threshold choice; these classes include bounded variation functions and (piecewise) Hölder functions of smoothness order $0 < \alpha \le 1$ as special cases. From this we derive statistical guarantees on feature detection in terms of jumps and modes. Another key finding is that these multiscale segmentation methods perform nearly (up to a log-factor) as well as the oracle piecewise constant segmentation estimator (with known jump locations) and the best piecewise constant approximant of the (unknown) true signal. The theoretical findings are examined by various numerical simulations.

Highlights

  • Throughout we assume that observations are given through the regression model
    $$y_i^n = f_i^n + \xi_i^n, \qquad i = 0, \dots, n-1, \tag{1}$$
    where
    $$f_i^n = n \int_{[i/n,\,(i+1)/n)} f_0(x)\,dx, \tag{2}$$
    and $\xi^n = (\xi_0^n, \dots, \xi_{n-1}^n)$ are independent centered sub-Gaussian random variables with scale parameter $\sigma$, that is,
    $$\mathbb{E}\, e^{u \xi_i^n} \le e^{u^2 \sigma^2 / 2} \quad \text{for every } u \in \mathbb{R}.$$
    For simplicity, the scale parameter $\sigma$ in model (1) is assumed to be known. (A simulation sketch of this model follows this list.)

  • We show that multiscale change-point segmentation (MCPS) methods with a universal threshold perform nearly as well as piecewise constant segmentation estimators whose change-point locations are provided by an oracle.

  • In order to investigate the robustness of MCPS methods with respect to model misspecification, we introduce a local trend component, as in Olshen et al. (2004) and Zhang and Siegmund (2007), to the test signal f0 in Section 6.1, which leads to a model with an additional deterministic trend term (see the sketch below).
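The model above is straightforward to simulate. The following minimal Python sketch (the step signal, the helper names, and the grid-averaging scheme are illustrative assumptions, not from the paper) draws data from model (1)-(2), using Gaussian noise as one concrete sub-Gaussian distribution with scale parameter σ:

```python
import numpy as np

def step_signal(x):
    """A hypothetical piecewise constant test signal f0 on [0, 1)."""
    return np.where(x < 0.3, 0.0, np.where(x < 0.7, 1.5, -0.5))

def simulate(n, sigma=0.3, trend=None, seed=None):
    """Draw y_i^n = f_i^n + xi_i^n, i = 0, ..., n-1, as in model (1)-(2).

    f_i^n = n * integral of f0 over [i/n, (i+1)/n) is approximated by
    averaging f0 on a fine grid within each cell.  Gaussian noise serves
    as one concrete sub-Gaussian distribution with scale parameter sigma.
    """
    rng = np.random.default_rng(seed)
    m = 100                                    # fine-grid points per cell
    x = (np.arange(n * m) + 0.5) / (n * m)     # midpoints of the fine grid
    f0 = step_signal(x)
    if trend is not None:                      # optional trend term, used in
        f0 = f0 + trend(x)                     # the misspecification sketch
    f = f0.reshape(n, m).mean(axis=1)          # f_i^n, local averages of f0
    y = f + sigma * rng.standard_normal(n)     # y_i^n = f_i^n + xi_i^n
    return y, f

# Example: n = 500 observations from the clean model (1)-(2).
y, f = simulate(500, sigma=0.3, seed=0)
```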

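Building on the same sketch, the misspecified model of Section 6.1 can be mimicked by superimposing a local trend on f0 before the cell averages are formed; the particular ramp below is a hypothetical stand-in, since the exact trend from Olshen et al. (2004) and Zhang and Siegmund (2007) is not reproduced here:

```python
import numpy as np

# Reusing simulate() from the sketch above: superimpose a hypothetical
# local trend on f0 before the cell averages f_i^n are formed, so the
# observations no longer come from a piecewise constant signal.
def local_trend(x, a=0.4, b=0.6, slope=3.0):
    """A ramp rising on [a, b] and flat elsewhere (illustrative choice)."""
    return slope * np.clip(x - a, 0.0, b - a)

y_mis, f_mis = simulate(500, sigma=0.3, seed=1, trend=local_trend)
```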

Summary

Introduction

From a slightly different perspective, we show that the MCPS methods perform nearly as well as the best (deterministic) piecewise constant approximant of the true signal with the same number of jumps or fewer (Proposition 2). Besides such theoretical interest (cf. Linton and Seo, 2014; Farcomeni, 2014), the study of these estimators in models beyond piecewise constant functions is of particular practical importance, since a piecewise constant function is known to be only an approximation of the underlying signal in many applications. We show that a large class of multiscale change-point segmentation methods with a universal parameter choice are adaptively minimax optimal (up to a log-factor) for step signals (possibly with an unbounded number of change-points) and for (piecewise) smooth signals in certain approximation spaces (Theorems 1 and 2) with respect to general $L^p$-risk.
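To make the benchmark in Proposition 2 concrete, the following sketch computes the best (least-squares) piecewise constant approximant of a discretized signal with at most k jumps by standard segmentation dynamic programming; the helper name `best_step_approx` and the O(k·n²) implementation are illustrative choices, not the authors' algorithm:

```python
import numpy as np

def best_step_approx(f, k):
    """Least-squares piecewise constant approximant of the vector f with
    at most k jumps (i.e. at most k + 1 constant segments).

    Standard O(k * n^2) segmentation dynamic program; this is the
    deterministic oracle approximant discussed above, not the MCPS
    estimator itself.
    """
    f = np.asarray(f, dtype=float)
    n = len(f)
    s1 = np.concatenate(([0.0], np.cumsum(f)))       # prefix sums of f
    s2 = np.concatenate(([0.0], np.cumsum(f ** 2)))  # prefix sums of f^2

    def sse(i, j):
        """Residual sum of squares of f[i:j] around its own mean."""
        s = s1[j] - s1[i]
        return (s2[j] - s2[i]) - s * s / (j - i)

    INF = float("inf")
    # cost[m][j]: best SSE for f[:j] split into m segments;
    # back[m][j]: start index of the last segment in that split.
    cost = [[INF] * (n + 1) for _ in range(k + 2)]
    back = [[0] * (n + 1) for _ in range(k + 2)]
    cost[0][0] = 0.0
    for m in range(1, k + 2):
        for j in range(m, n + 1):
            for i in range(m - 1, j):
                c = cost[m - 1][i] + sse(i, j)
                if c < cost[m][j]:
                    cost[m][j], back[m][j] = c, i
    # Best number of segments (at most k + 1), then trace the cuts back.
    m = min(range(1, k + 2), key=lambda mm: cost[mm][n])
    approx, j = np.empty(n), n
    while m > 0:
        i = back[m][j]
        approx[i:j] = f[i:j].mean()   # each segment is fitted by its mean
        j, m = i, m - 1
    return approx
```

For example, `best_step_approx(f, 2)` returns the best approximation of `f` by a step function with at most two jumps; this is the oracle benchmark that the MCPS estimate is compared against in Proposition 2.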

Multiscale change-point segmentation
Convergence rates for step functions
Robustness to model misspecification
Feature detection
Oracle segmentation
Oracle approximant
Simulation study
Stability
Different noise levels
Robustness and feature detection
Empirical convergence rates
Conclusion
Proof of Theorem 1
Proof of Theorem 2
Proof of Theorem 5
