Abstract

Additive regression models have a long history in multivariate non-parametric regression. They provide a model in which the regression function is decomposed as a sum of functions, each depending only on a single explanatory variable. The advantage of additive models over general non-parametric regression models is that they yield estimators converging at the optimal univariate rate, thereby avoiding the so-called curse of dimensionality. Beyond backfitting, marginal integration is a common procedure for estimating each component in additive models. In this paper, we propose a robust estimator of the additive components which combines local polynomials on the component to be estimated with the marginal integration procedure. The proposed estimators are consistent and asymptotically normally distributed. A simulation study illustrates the advantage of the proposal over the classical estimator when outliers are present in the responses, showing that the proposed estimators combine good robustness and efficiency properties.
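
To fix ideas, under the additive model the regression function takes the form m(x) = mu + g_1(x_1) + ... + g_d(x_d), and the marginal integration idea recovers a component g_alpha by averaging a multivariate local fit over the empirical distribution of the remaining covariates. The Python sketch below is only a minimal illustration of that principle under assumed choices not taken from the paper: a Gaussian product kernel, a Huber-type iteratively reweighted local-linear fit in the coordinate of interest, a single bandwidth h, and a fixed MAD scale estimate. All function names and tuning constants are illustrative.

import numpy as np

def huber_irls_weights(r, c=1.345):
    """IRLS weights for the Huber loss: w(r) = min(1, c / |r|)."""
    a = np.abs(r)
    w = np.ones_like(a)
    w[a > c] = c / a[a > c]
    return w

def robust_local_linear(x0, X, y, h, alpha=0, n_iter=20, c=1.345):
    """
    Huber M-type local fit at the point x0: linear in coordinate `alpha`,
    local constant in the remaining coordinates through the product kernel.
    Returns the fitted regression value at x0.
    """
    n = X.shape[0]
    # Gaussian product kernel weights in all coordinates
    K = np.exp(-0.5 * ((X - x0) / h) ** 2).prod(axis=1)
    # design matrix: intercept + linear term in the coordinate of interest
    Z = np.column_stack([np.ones(n), X[:, alpha] - x0[alpha]])
    beta = np.zeros(2)
    # preliminary robust scale (MAD), kept fixed along the iterations
    scale = np.median(np.abs(y - np.median(y))) / 0.6745 + 1e-12
    for _ in range(n_iter):
        r = (y - Z @ beta) / scale
        w = K * huber_irls_weights(r, c)
        WZ = Z * w[:, None]
        beta_new = np.linalg.solve(Z.T @ WZ + 1e-10 * np.eye(2), WZ.T @ y)
        if np.allclose(beta_new, beta, atol=1e-8):
            beta = beta_new
            break
        beta = beta_new
    return beta[0]

def marginal_integration_component(x_alpha_grid, X, y, h, alpha=0):
    """
    Marginal-integration estimate of the additive component g_alpha:
    average the robust local fit over the empirical distribution of the
    nuisance coordinates, then center the curve to have mean zero.
    """
    n = X.shape[0]
    eta = np.empty(len(x_alpha_grid))
    for k, t in enumerate(x_alpha_grid):
        fits = np.empty(n)
        for i in range(n):
            x0 = X[i].copy()
            x0[alpha] = t            # evaluate at (t, X_{i,-alpha})
            fits[i] = robust_local_linear(x0, X, y, h, alpha)
        eta[k] = fits.mean()
    return eta - eta.mean()

A toy run on a two-covariate additive model with a few contaminated responses could look as follows; the bandwidth and grid are arbitrary choices for illustration.

# Y = sin(2*pi*X1) + X2^2 + noise, with a handful of outlying responses
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))
y = np.sin(2 * np.pi * X[:, 0]) + X[:, 1] ** 2 + 0.2 * rng.standard_normal(200)
y[:10] += 10.0                       # contaminate a few responses
grid = np.linspace(0.05, 0.95, 20)
g1_hat = marginal_integration_component(grid, X, y, h=0.15, alpha=0)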
