Abstract

We consider the problem of estimating the location of a change point \(\theta_0\) in a regression model. Most change point models studied so far involve regression functions with a jump at \(\theta_0\); in contrast, we focus on regression functions that are continuous at \(\theta_0\). The degree of smoothness \(q_0\) has to be estimated as well. We investigate the consistency, as the sample size \(n\) increases, of the least squares estimates \((\hat{\theta}_n,\hat{q}_n)\) of \((\theta_0, q_0)\). It turns out that the rate of convergence of \(\hat{\theta}_n\) depends on \(q_0\): for \(q_0 > 1/2\) the rate is \(\sqrt{n}\) and the estimator is asymptotically normal; for \(q_0 < 1/2\) the rate is \(n^{1/(2q_0+1)}\) and the suitably rescaled change point estimator converges in distribution to a maximizer of a Gaussian process; for \(q_0 = 1/2\) the rate is \(\sqrt{n \ln n}\). Interestingly, in this last case the limiting distribution is also normal.
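
The abstract does not state the regression function explicitly. The following minimal simulation sketch assumes the canonical continuous change point specification \(f(x) = a + b\,(x-\theta_0)_+^{q_0}\), which is continuous at \(\theta_0\) for \(q_0 > 0\); the coefficients \(a, b\), the noise level, and the grid ranges are illustrative assumptions, not taken from the paper. It shows how the least squares estimates \((\hat{\theta}_n,\hat{q}_n)\) can be computed by profiling out the linear coefficients and grid-searching over \((\theta, q)\).

```python
import numpy as np

# Assumed (hypothetical) model instance: f(x) = a + b * max(x - theta0, 0)**q0.
# The paper's exact model is not given in the abstract; this is one common
# continuous change point specification used for illustration only.
rng = np.random.default_rng(0)
n = 2000
theta0, q0, a, b = 0.5, 0.75, 1.0, 2.0

x = rng.uniform(0.0, 1.0, n)
y = a + b * np.maximum(x - theta0, 0.0) ** q0 + 0.1 * rng.standard_normal(n)

def sse(theta, q):
    """Residual sum of squares after profiling out (a, b) by linear least squares."""
    z = np.maximum(x - theta, 0.0) ** q
    design = np.column_stack([np.ones(n), z])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    resid = y - design @ coef
    return resid @ resid

# Grid search for the least squares estimates (theta_hat, q_hat).
thetas = np.linspace(0.05, 0.95, 181)
qs = np.linspace(0.1, 1.5, 141)
theta_hat, q_hat = min(
    ((t, q) for t in thetas for q in qs), key=lambda p: sse(*p)
)
print(f"theta_hat = {theta_hat:.3f}, q_hat = {q_hat:.3f}")
```

A plain grid search suffices here because, for fixed \((\theta, q)\), the remaining coefficients enter linearly and can be eliminated in closed form; the nonstandard rates described above concern the asymptotic behavior of the resulting minimizer, not the optimization itself.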
