Abstract
The problem of estimating a smooth monotone regression function $m$ will be studied. We will consider the estimator $m_{SI}$ consisting of a smoothing step (application of a kernel estimator based on a kernel $K$) and of an isotonisation step (application of the pool adjacent violators algorithm). The estimator $m_{SI}$ will be compared with the estimator $m_{IS}$ in which these two steps are interchanged. A higher order stochastic expansion of these estimators will be given which shows that $m_{SI}$ and $m_{IS}$ are asymptotically first order equivalent and that $m_{IS}$ has a smaller mean squared error than $m_{SI}$ if and only if the kernel function of the kernel estimator is not too smooth.
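The two estimators differ only in the order of the smoothing and isotonisation steps. A minimal numerical sketch of that construction is given below; the Epanechnikov kernel, the bandwidth $h$, the evaluation grid, the simulated monotone test function, and the use of scikit-learn's IsotonicRegression as the pool-adjacent-violators step are all illustrative assumptions, not choices taken from the paper.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

def kernel_smooth(x_grid, x, y, h):
    """Nadaraya-Watson estimate on x_grid using an Epanechnikov kernel K (assumed choice)."""
    u = (x_grid[:, None] - x[None, :]) / h
    w = np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)
    return (w @ y) / np.maximum(w.sum(axis=1), 1e-12)

rng = np.random.default_rng(0)
n, h = 200, 0.15                                   # sample size and bandwidth (illustrative)
x = np.sort(rng.uniform(0.0, 1.0, n))
y = np.sin(0.5 * np.pi * x) + 0.2 * rng.standard_normal(n)   # noisy smooth monotone m
grid = np.linspace(0.0, 1.0, 101)

# m_SI: smooth first (kernel estimator), then isotonise the smoothed curve (PAVA)
m_smooth = kernel_smooth(grid, x, y, h)
m_SI = IsotonicRegression().fit_transform(grid, m_smooth)

# m_IS: isotonise the raw responses first (PAVA), then smooth the isotonised values
y_iso = IsotonicRegression().fit_transform(x, y)
m_IS = kernel_smooth(grid, x, y_iso, h)
```

On such a simulated sample the two curves are typically close, consistent with the first order equivalence stated above; their finite-sample difference is what the higher order expansion quantifies.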