Abstract

We consider estimation of a step function f from noisy observations of the convolution ϕ∗f, where ϕ is a bounded L1-function. We reconstruct the signal f from the observations with a penalized least squares estimator, with penalty equal to the number of jumps of the reconstruction. Asymptotically, it is possible to estimate the number of jumps correctly with probability one. Given that the number of jumps is correctly estimated, we show that for a bounded kernel ϕ the corresponding estimates of the jump locations and jump heights are n^{-1/2}-consistent and converge to a joint normal distribution with covariance structure depending on ϕ. As a special case we obtain the asymptotic distribution of the least squares estimator in multiphase regression and generalizations thereof. Finally, singular integral kernels are briefly discussed, and it is shown that the n^{-1/2} rate can be improved.

Highlights

  • Assume we have observations from the regression model Y_i = (Φf)(x_i) + ε_i, i = 1, …, n, (1), where Φf = φ∗f denotes the convolution of the L1-functions φ and f, and ε_1, ε_2, … are i.i.d. mean-zero random variables with finite second moment

  • In the following we refer to model (1) as the inverse regression model, and we assume throughout that φ is known

  • The estimator is n^{-1/2}-consistent and follows a multivariate normal limit law; this is in strict contrast to the case of direct regression (where Φ in (1) is the identity)
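The observation model (1) in the highlights can be simulated with a short sketch. All concrete choices below (the jump locations, the Gaussian-shaped kernel φ, the noise level, and the function names) are illustrative assumptions, not taken from the paper; the convolution is approximated by a Riemann sum on a fine grid.

```python
import numpy as np

def step_function(t, jumps, heights):
    """Piecewise-constant f: heights[k] on [jumps[k], jumps[k+1]);
    heights[0] also extends to the left of the first jump."""
    idx = np.clip(np.searchsorted(jumps, t, side="right") - 1, 0, None)
    return np.asarray(heights)[idx]

def convolved_signal(x, jumps, heights, phi, grid):
    """Evaluate (phi * f)(x) by a Riemann sum on a fine uniform grid."""
    dx = grid[1] - grid[0]
    f_vals = step_function(grid, jumps, heights)
    return np.array([(phi(xi - grid) * f_vals).sum() * dx for xi in x])

# Illustrative choices: f with jumps at 0.3 and 0.6, Gaussian-shaped kernel phi.
rng = np.random.default_rng(0)
n = 200
x = np.linspace(0.0, 1.0, n)
jumps, heights = [0.0, 0.3, 0.6], [0.0, 1.0, -0.5]
phi = lambda t: np.exp(-t ** 2 / (2 * 0.05 ** 2)) / np.sqrt(2 * np.pi * 0.05 ** 2)
grid = np.linspace(0.0, 1.0, 2001)  # support of f
# Model (1): Y_i = (phi * f)(x_i) + eps_i with i.i.d. mean-zero noise
y = convolved_signal(x, jumps, heights, phi, grid) + 0.1 * rng.standard_normal(n)
```

The resulting `y` is a smoothed, noisy version of the step function; the deconvolution task is to recover the jumps of f from it.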

Summary

Introduction

The space of locally constant functions considered in this paper (albeit of infinite dimension) generically yields an n^{-1/2} rate of convergence, which renders deconvolution in this setting a practically feasible task. In this case the correct (and finite) number of jumps will be estimated asymptotically, and the problem reduces to a (nonsmooth) nonlinear regression problem. If the smooth part of the function of interest belongs to a Paley-Wiener class, a rate of min(n^{-1/2}, n^{-1/(2β+1)}) can be obtained up to a logarithmic factor. Recent work (Goldenshluger et al., 2006a,b) generalizes these results to a unifying framework of sequence-space models covering delay and amplitude estimation, estimation of change points in derivatives, and change-point estimation in a convolution white-noise model. For ease of notation, for any a, b ∈ R, [a, b] and (a, b) always denote the intervals [min(a, b), max(a, b)] and (min(a, b), max(a, b)), respectively.
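The penalized least squares estimator described in the abstract (residual sum of squares plus a penalty proportional to the number of jumps) can be sketched in a brute-force toy form. This is not the paper's algorithm: restricting the jumps to a coarse candidate grid, the exhaustive search, and the tuning constant `gamma` are all illustrative assumptions. The key structural fact used here is that, given the jump locations, the convolved step function is linear in the jump heights, so the heights can be fitted by ordinary least squares.

```python
import numpy as np
from itertools import combinations

def design_matrix(x, taus, phi, grid):
    """Columns: (phi * 1_[a, b))(x_i) for consecutive segment bounds,
    evaluated by a Riemann sum on the uniform grid."""
    dx = grid[1] - grid[0]
    bounds = np.concatenate(([grid[0]], np.sort(taus), [grid[-1]]))
    cols = []
    for a, b in zip(bounds[:-1], bounds[1:]):
        mask = (grid >= a) & (grid < b)
        cols.append([phi(xi - grid[mask]).sum() * dx for xi in x])
    return np.array(cols).T

def penalized_lse(x, y, phi, grid, candidates, max_jumps, gamma):
    """Minimize RSS + gamma * (#jumps) over step functions whose jumps are
    restricted to a coarse candidate set (exhaustive search, sketch only)."""
    best_crit, best_taus, best_heights = np.inf, (), None
    for k in range(max_jumps + 1):
        for taus in combinations(candidates, k):
            X = design_matrix(x, np.array(taus), phi, grid)
            # Heights enter linearly given the jump locations: fit by OLS.
            h, *_ = np.linalg.lstsq(X, y, rcond=None)
            crit = np.sum((y - X @ h) ** 2) + gamma * k
            if crit < best_crit:
                best_crit, best_taus, best_heights = crit, taus, h
    return best_crit, best_taus, best_heights
```

With a well-chosen penalty constant, the minimizer picks the correct number of jumps and jump locations close to the truth, matching the consistency statement in the abstract.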

Assumptions
Estimate and asymptotic results
Some technical lemmata
Entropy results
Consistency
Asymptotic normality
A lower bound for estimating the jump locations