Abstract
For nonparametric regression estimation, when the unknown function belongs to a Sobolev smoothness class, sharp risk bounds for the integrated mean squared error have recently been found that improve on optimal-rate-of-convergence results. The key to these bounds has been the fact that, under normality of the errors, the minimax linear estimator is asymptotically minimax within the class of all estimators. We extend this result to the nonnormal case, in which the noise distribution is unknown. The corresponding asymptotic lower risk bound is established, based on an analogy with a location model in the independent, identically distributed case. Attainment of the bound and its relation to adaptive optimal smoothing are discussed.
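For orientation, here is a hedged sketch of the kind of sharp bound meant above, written in the notation of the standard Pinsker framework (the symbols β, L, σ² and the normalization are assumptions of this sketch, not taken from the paper itself): for regression from n observations with i.i.d. errors of variance σ², over a Sobolev ball W(β, L) of β-smooth functions, results of Pinsker type state that

\[
\inf_{\hat f_n}\; \sup_{f \in W(\beta, L)} \mathbf{E}\,\lVert \hat f_n - f \rVert_2^2
\;=\; \gamma(\beta, L)\, \Bigl(\frac{\sigma^2}{n}\Bigr)^{2\beta/(2\beta+1)} \bigl(1 + o(1)\bigr),
\qquad n \to \infty,
\]

where γ(β, L) is the Pinsker constant and the asymptotic minimax risk is attained by a linear filtering estimator. The extension announced in the abstract then asserts, roughly, that this first-order bound persists when the errors have variance σ² but an otherwise unknown distribution; heuristically this is plausible because, among distributions with a fixed variance, the Gaussian minimizes Fisher information and is therefore the asymptotically least favorable noise.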