Abstract

Classical change point analysis aims at (1) detecting abrupt changes in the mean of a possibly nonstationary time series and (2) identifying regions where the mean exhibits piecewise constant behavior. In many applications, however, it is more reasonable to assume that the mean changes gradually in a smooth way. Such gradual changes may be either non-relevant (i.e., small) or relevant for the specific problem at hand, and the present paper develops statistical methodology to detect the latter. More precisely, we consider the common nonparametric regression model $X_i = \mu(i/n) + \varepsilon_i$ with centered errors and propose a test for the null hypothesis that the maximum absolute deviation of the regression function $\mu$ from a functional $g(\mu)$ (such as the value $\mu(0)$ or the integral $\int_0^1 \mu(t)\,dt$) is smaller than a given threshold on a given interval $[x_0, x_1] \subseteq [0,1]$. A test for this type of hypothesis is developed using an appropriate estimator, say $\hat{d}_{\infty,n}$, of the maximum deviation $d_\infty = \sup_{t \in [x_0, x_1]} |\mu(t) - g(\mu)|$. We derive the limiting distribution of an appropriately standardized version of $\hat{d}_{\infty,n}$, where the standardization depends on the Lebesgue measure of the set of extremal points of the function $\mu(\cdot) - g(\mu)$. A refined procedure based on an estimate of this set is developed and its consistency is proved. The results are illustrated by means of a simulation study and a data example.
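To make the quantity $d_\infty$ concrete, the sketch below shows a naive plug-in estimate: the mean function is estimated with a Nadaraya-Watson kernel smoother, $g(\mu)$ is taken to be the integral $\int_0^1 \mu(t)\,dt$, and the supremum is approximated on a grid over $[x_0, x_1]$. This is only an illustration of the target quantity, not the authors' calibrated test; the bandwidth `h`, the grid size, and the simulated mean and error distribution are assumptions for the example.

```python
# Minimal plug-in sketch of d_inf_hat (illustrative only, not the paper's test).
import numpy as np


def nw_estimate(x_grid, design, X, h):
    """Nadaraya-Watson estimate of mu at the points in x_grid (Gaussian kernel)."""
    # Kernel weights, shape (len(x_grid), n)
    w = np.exp(-0.5 * ((x_grid[:, None] - design[None, :]) / h) ** 2)
    return (w @ X) / w.sum(axis=1)


def d_inf_hat(X, x0=0.0, x1=1.0, h=0.1, grid_size=500):
    """Plug-in estimate of sup_{t in [x0, x1]} |mu(t) - integral_0^1 mu(s) ds|."""
    n = len(X)
    design = np.arange(1, n + 1) / n            # design points i/n
    grid = np.linspace(0.0, 1.0, grid_size)
    mu_hat = nw_estimate(grid, design, X, h)
    g_hat = np.trapz(mu_hat, grid)              # g(mu) chosen as the integral of mu
    mask = (grid >= x0) & (grid <= x1)
    return np.max(np.abs(mu_hat[mask] - g_hat))


# Example: a gradually changing mean observed with noise
rng = np.random.default_rng(0)
n = 500
t = np.arange(1, n + 1) / n
X = np.sin(np.pi * t) + 0.2 * rng.standard_normal(n)
print(d_inf_hat(X, x0=0.1, x1=0.9, h=0.08))
```

In the paper, testing whether $d_\infty$ exceeds a relevant threshold additionally requires the limiting distribution of the standardized estimator, which the sketch above does not address.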
