Abstract

In this paper, we address the problem of nonparametric regression estimation in the infinite-dimensional setting. We start by extending Stone's seminal result to the case of metric spaces when the probability measure of the explanatory variables is tight. Then, under slight variations of the hypotheses, we state and prove the theorem for general metric measure spaces. From this result, we derive the mean square consistency of the k-NN and kernel estimators when the regression function is bounded and the Besicovitch condition holds. We also prove that, for the uniform kernel estimate, the Besicovitch condition is necessary as well in order to attain L1 consistency for almost every x.
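For orientation, the following is a minimal sketch of the standard objects the abstract refers to, written in notation that is assumed rather than taken from the paper: the kernel regression estimate on a metric space and the Besicovitch condition on the regression function eta(x) = E[Y | X = x].

% Assumed notation: (E, d) a metric space, mu the law of X,
% \eta(x) = E[Y \mid X = x] the regression function,
% B(x, \delta) the closed ball of radius \delta centered at x.
% Kernel estimate with bandwidth h_n; the uniform kernel is
% K = \mathbf{1}_{[0,1]}:
\[
  \eta_n(x) \;=\;
  \frac{\sum_{i=1}^{n} Y_i \, K\!\bigl(d(x, X_i)/h_n\bigr)}
       {\sum_{i=1}^{n} K\!\bigl(d(x, X_i)/h_n\bigr)} .
\]
% Besicovitch condition: the regression function is recovered, in the
% mean, by its averages over small balls, for \mu-almost every x:
\[
  \lim_{\delta \to 0}
  \frac{1}{\mu\bigl(B(x,\delta)\bigr)}
  \int_{B(x,\delta)} \bigl|\eta(y) - \eta(x)\bigr| \, d\mu(y) \;=\; 0 .
\]

In finite dimension this condition is automatic by the Lebesgue differentiation theorem; in infinite-dimensional metric measure spaces it is a genuine assumption, which is why its necessity for the uniform kernel estimate is a meaningful result.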
