Abstract

This paper is about optimal estimation of the additive components of a nonparametric, additive isotone regression model. It is shown that, asymptotically up to first order, each additive component can be estimated as well as it could be by a least squares estimator if the other components were known. The algorithm for the calculation of the estimator uses backfitting. Convergence of the algorithm is shown. Finite sample properties are also compared through simulation experiments.

In this paper we discuss nonparametric additive monotone regression models. We discuss a backfitting estimator that is based on iterative application of the pool adjacent violators algorithm to the additive components of the model. Our main result states the following oracle property: asymptotically up to first order, each additive component is estimated as well as it would be (by a least squares estimator) if the other components were known. This goes beyond the classical finding that the estimator achieves the same rate of convergence independently of the number of additive components. The result states that the asymptotic distribution of the estimator does not depend on the number of components.

We have two motivations for considering this model. First of all, we think that this is a useful model for some applications. For a discussion of isotonic additive regression from a more applied point of view, see also Bacchetti (1), Morton-Jones et al. (32) and De Boer, Besten and Ter Braak (7). But our main motivation comes from statistical theory. We think that the study of nonparametric models with several nonparametric components is not fully understood. The oracle property that is stated in this paper for additive isotone models has been shown for smoothing estimators in some other nonparametric models. This property is expected to hold if the estimation of the different nonparametric components is based on local smoothing where the localization takes place on different scales. An example is additive models of smooth functions, where each localization takes place with respect to another covariate. In Mammen, Linton and Nielsen (28) the oracle property has been verified for the local linear smooth backfitting estimator. Like local linear estimators, the isotonic least squares estimator is also a local smoother. The estimator is a local average of the response variable, but in contrast to local linear estimators the local neighborhood is chosen

* Research of this paper was supported by the Deutsche Forschungsgemeinschaft project MA
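To make the description of the backfitting estimator concrete, the following sketch illustrates one common form of the idea: cyclic backfitting in which each additive component is refit by the pool adjacent violators algorithm (PAVA) on partial residuals. This is a minimal illustration only, not the authors' code; the function names (pava, isotone_backfit), the centering step for identifiability, and the fixed number of sweeps are our own assumptions.

```python
import numpy as np


def pava(y):
    """Pool adjacent violators: non-decreasing least squares fit to y (y already ordered by x)."""
    blocks = []  # each block stores [block mean, block size]
    for v in y:
        blocks.append([float(v), 1.0])
        # merge adjacent blocks while they violate monotonicity
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            w = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / w, w])
    # expand block means back to a vector of the original length
    return np.concatenate([np.full(int(w), m) for m, w in blocks])


def isotone_backfit(X, y, n_sweeps=50):
    """Cyclic backfitting: each component is updated by an isotone (PAVA) fit to partial residuals.

    X: (n, d) covariate matrix, y: (n,) responses.
    Returns the overall level c and the fitted component values f (n, d) at the data points.
    """
    n, d = X.shape
    f = np.zeros((n, d))      # current estimates of the additive components at the data points
    c = y.mean()              # overall level
    for _ in range(n_sweeps):
        for k in range(d):
            r = y - c - f.sum(axis=1) + f[:, k]   # partial residuals with component k removed
            order = np.argsort(X[:, k])
            f[order, k] = pava(r[order])          # isotone least squares update for component k
            f[:, k] -= f[:, k].mean()             # center each component for identifiability
    return c, f


# Example usage on synthetic data with two monotone components:
# rng = np.random.default_rng(0)
# X = rng.uniform(size=(200, 2))
# y = X[:, 0] ** 2 + np.sqrt(X[:, 1]) + 0.1 * rng.standard_normal(200)
# c, f = isotone_backfit(X, y)
```

In this sketch the backfitting loop runs for a fixed number of sweeps; in practice one would instead monitor the change in the fitted components between sweeps as a stopping rule.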
