Abstract

Additive models are popular in high-dimensional regression problems owing to their flexibility in model building and their optimality in additive function estimation. Moreover, they do not suffer from the so-called curse of dimensionality that generally arises in non-parametric regression settings. Less well known is the model bias incurred by the restriction to the additive class of models. We introduce a new class of estimators that reduces additive model bias yet preserves some of the stability of the additive estimator. The new estimator is constructed by localizing the additivity assumption and is thus named the local additive estimator. It follows the spirit of local linear estimation but is shown to partially relieve the dimensionality problem. Implementation is straightforward with any standard software for additive regression. For detailed analysis we explicitly use the smooth backfitting estimator of Mammen, Linton and Nielsen.
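To illustrate the kind of additive fitting the abstract refers to, below is a minimal sketch of classical backfitting for an additive model, using a simple Gaussian-kernel smoother. This is an illustrative assumption on our part: it is neither the smooth backfitting estimator of Mammen, Linton and Nielsen nor the local additive estimator proposed in the paper, and all function and parameter names are hypothetical.

```python
import numpy as np

def kernel_smooth(x, y, x_eval, h):
    """Nadaraya-Watson smoother with a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def backfit(X, y, h=0.2, n_iter=20):
    """Classical backfitting sketch for y ~ alpha + sum_j f_j(X[:, j]).

    Each pass smooths the partial residuals against one covariate,
    then centers the component for identifiability.
    """
    n, d = X.shape
    alpha = y.mean()
    f = np.zeros((n, d))
    for _ in range(n_iter):
        for j in range(d):
            # partial residual: remove all components except the j-th
            r = y - alpha - f.sum(axis=1) + f[:, j]
            f[:, j] = kernel_smooth(X[:, j], r, X[:, j], h)
            f[:, j] -= f[:, j].mean()  # center each component
    return alpha, f
```

The local additive estimator described in the abstract would instead impose additivity only locally, but any such implementation can reuse an additive-regression routine like the one sketched here as its building block.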
