Abstract

In this paper a new method for monotone estimation of a regression function is proposed. The estimator is obtained by combining a density estimate with a regression estimate and is appealing to users of conventional smoothing methods such as kernel estimators, local polynomials, series estimators, or smoothing splines. The main idea of the new approach is to construct a density estimate from the estimated values m̂(i/N), i = 1, …, N, of the regression function and to use these “data” to calculate an estimate of the inverse of the regression function. The final estimate is then obtained by numerical inversion. Compared to the conventionally used techniques for monotone estimation, the new method is computationally more efficient because it does not require constrained optimization for the calculation of the estimate. We prove asymptotic normality of the new estimate and compare its asymptotic properties with those of the unconstrained estimate. In particular, it is shown that for kernel estimates or local polynomials the monotone estimate is first-order asymptotically equivalent to the unconstrained estimate. We also illustrate the performance of the new procedure by means of a simulation study.
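The three steps described above (an unconstrained fit on a grid, a kernel estimate of the inverse function built from the fitted values, and a numerical inversion) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Nadaraya–Watson estimator, the Gaussian kernel, and the bandwidths `h_r` and `h_d` are illustrative choices.

```python
import numpy as np
from scipy.stats import norm

def monotone_estimate(x, y, grid, N=100, h_r=0.1, h_d=0.05):
    """Sketch of a density-based monotone regression estimate on [0, 1].

    (1) Unconstrained kernel estimate m_hat at the grid points i/N.
    (2) Kernel-smoothed distribution function of the values m_hat(i/N),
        which serves as an estimate of the inverse regression function.
    (3) Numerical inversion by interpolation.
    """
    # Step 1: unconstrained Nadaraya-Watson estimate at t = i/N
    t = np.arange(1, N + 1) / N
    w = norm.pdf((t[:, None] - x[None, :]) / h_r)
    m_hat = (w @ y) / w.sum(axis=1)

    # Step 2: estimate of the inverse function at levels u,
    # m_inv(u) = (1/N) * sum_i Phi((u - m_hat(i/N)) / h_d),
    # i.e. a smoothed empirical distribution of the fitted values
    u = np.linspace(m_hat.min() - 3 * h_d, m_hat.max() + 3 * h_d, 400)
    m_inv = norm.cdf((u[None, :] - m_hat[:, None]) / h_d).mean(axis=0)

    # Step 3: m_inv is strictly increasing in u, so invert it by
    # interpolating u as a function of m_inv at the requested points
    return np.interp(grid, m_inv, u)
```

By construction the returned values are nondecreasing in the evaluation points, since a strictly increasing function is inverted; no constrained optimization is involved, which is the computational advantage the abstract refers to.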
