Abstract
To achieve robustness against outliers or heavy-tailed sampling distributions, we consider an Ivanov regularized empirical risk minimization scheme associated with a modified Huber loss for nonparametric regression in a reproducing kernel Hilbert space. By tuning the scaling and regularization parameters in accordance with the sample size, we develop nonasymptotic concentration results for this adaptive estimator. In particular, we establish the best convergence rates for the prediction error when the conditional distribution satisfies only a weak moment condition.
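The paper's estimator constrains the RKHS norm of the fit (Ivanov regularization) rather than penalizing it, while replacing the squared loss with a Huber-type loss. The sketch below is only illustrative: it uses the standard Huber loss (not the paper's modified version), a Gaussian kernel, and projected gradient descent on the representer coefficients; the function names, step size, and tuning parameters are assumptions, not the authors' procedure.

```python
import numpy as np

def gaussian_kernel(X, Z, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel; kernel choice is illustrative.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def huber_grad(r, sigma):
    # Derivative of the standard Huber loss: identity on [-sigma, sigma],
    # clipped outside (this is a stand-in for the paper's modified loss).
    return np.clip(r, -sigma, sigma)

def ivanov_huber_regression(X, y, radius, sigma, gamma=1.0, lr=1e-2, n_iter=500):
    """Hypothetical projected-gradient sketch of Ivanov-constrained ERM.

    Minimizes (1/n) * sum_i L_sigma(y_i - f(x_i)) over the RKHS ball
    ||f||_H <= radius, with f(x) = sum_j alpha_j k(x_j, x) by the
    representer theorem, so ||f||_H^2 = alpha^T K alpha.
    """
    n = len(y)
    K = gaussian_kernel(X, X, gamma)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        residual = y - K @ alpha
        # Gradient of the empirical Huber risk with respect to alpha.
        grad = -K @ huber_grad(residual, sigma) / n
        alpha -= lr * grad
        # Project back onto the Ivanov ball {alpha : alpha^T K alpha <= radius^2}.
        norm = np.sqrt(max(alpha @ K @ alpha, 1e-12))
        if norm > radius:
            alpha *= radius / norm
    return alpha, K
```

In this sketch the scaling parameter `sigma` of the loss and the ball radius `radius` play the roles of the paper's scaling and regularization parameters, which the paper tunes with the sample size; the specific tuning rules are not reproduced here.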