Abstract
The problem of approximating a general regression function m(x) = E(Y | X = x) is addressed. As in the case of the classical L2-type projection pursuit regression considered by Hall (1989), we propose to approximate m(x) through a regression of Y given an index, that is, a one-dimensional projection of X. The orientation vector defining the projection of X is taken to be the optimum of a Kullback-Leibler type criterion. The first step of the classical projection pursuit regression and the single-index models (SIM) are obtained as particular cases. We define a kernel-based estimator of the 'optimal' orientation vector and we suggest a simple empirical bandwidth selection rule. Finally, the true regression function m(·) is approximated through a kernel regression of Y given the estimated index. Our procedure extends the idea of Härdle, Hall and Ichimura (1993), who propose, in the case of SIM, to minimize an empirical L2-type criterion simultaneously with respect to the orientation vector and the bandwidth. We show that the same bandwidth, of order n^{-1/5}, can be used both for the root-n estimation of the orientation and for the kernel approximation of the true regression function. Our methodology could be extended to more accurate multi-index approximations.
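To make the overall single-index procedure concrete, the sketch below illustrates the general idea described in the abstract: estimate an orientation vector by optimizing a profile criterion over projections of X, then approximate m(·) by a kernel regression of Y on the estimated index, with a bandwidth of order n^{-1/5}. This is only an illustration under simplifying assumptions: it uses a Gaussian kernel, a leave-one-out least-squares criterion (a stand-in, not the paper's Kullback-Leibler type criterion), a crude rule-of-thumb bandwidth, and a Nelder-Mead optimizer; the function names (nw_regression, leave_one_out_sse, fit_single_index) are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def nw_regression(index, y, grid, h):
    """Nadaraya-Watson kernel regression of y on a one-dimensional index."""
    u = (grid[:, None] - index[None, :]) / h      # pairwise scaled distances
    w = np.exp(-0.5 * u ** 2)                     # Gaussian kernel weights
    return (w @ y) / np.clip(w.sum(axis=1), 1e-12, None)

def leave_one_out_sse(theta, x, y, h):
    """Profile criterion: leave-one-out squared error of the kernel fit of y
    on the index x @ theta (an L2 stand-in for the paper's KL-type criterion)."""
    theta = theta / np.linalg.norm(theta)         # identifiability: unit-norm orientation
    idx = x @ theta
    u = (idx[:, None] - idx[None, :]) / h
    w = np.exp(-0.5 * u ** 2)
    np.fill_diagonal(w, 0.0)                      # leave each observation out of its own fit
    fit = (w @ y) / np.clip(w.sum(axis=1), 1e-12, None)
    return np.mean((y - fit) ** 2)

def fit_single_index(x, y):
    """Estimate the orientation vector, using a bandwidth of order n^(-1/5)."""
    n, d = x.shape
    h = np.std(x) * n ** (-1 / 5)                 # crude rule-of-thumb bandwidth
    res = minimize(leave_one_out_sse, x0=np.ones(d), args=(x, y, h),
                   method="Nelder-Mead")
    return res.x / np.linalg.norm(res.x), h

# Usage on simulated single-index data: m(x) = sin(theta' x).
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 3))
theta_true = np.array([1.0, 2.0, 0.0]) / np.sqrt(5.0)
y = np.sin(x @ theta_true) + 0.1 * rng.normal(size=200)
theta_hat, h = fit_single_index(x, y)
m_hat = nw_regression(x @ theta_hat, y, x[:5] @ theta_hat, h)  # m(x) at 5 sample points
```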