Abstract

Sliced Inverse Regression (S.I.R.) is a method for reducing the dimension of the explanatory variable $x$ in nonparametric regression problems. Li (1991) considers a general regression model of the form $y = g(\beta_1' x, \ldots, \beta_K' x, \varepsilon)$ with an arbitrary and unknown link function $g$, and studies a link-free and distribution-free method for estimating $E$, the space spanned by the $\beta_k$'s, called the effective dimension reduction (e.d.r.) space. The method is widely applicable, easy to implement on a computer, and requires no nonparametric smoothing devices such as kernel regression. It begins with a partition of the range of $y$ into a fixed number of slices; let $T(\cdot)$ denote this partition. The conditional mean of $x$ given $T(y)$ is then estimated by the sample mean of $x$ within each slice. The covariance matrix of the conditional mean, $\Gamma = \operatorname{Cov}(E[x \mid T(y)])$, is then estimated by $\widehat{\Gamma}_n$, the sample covariance matrix of the slice means. Let $\Sigma$ (resp. $\widehat{\Sigma}_n$) denote the theoretical (resp. sample) covariance matrix of $x$. Finally, the $K$ eigenvectors associated with the $K$ largest eigenvalues of $\Sigma^{-1}\Gamma$ span the e.d.r. space, and the $K$ eigenvectors associated with the $K$ largest eigenvalues of $\widehat{\Sigma}_n^{-1}\widehat{\Gamma}_n$ give an estimate of a basis of $E$. In this paper, we establish the asymptotic distribution of this estimator of a basis of $E$. The asymptotic distributions of the associated eigenprojector, eigenvalues and eigenvectors are obtained.
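
As a concrete illustration of the estimation steps summarized above (slice $y$, average $x$ within slices, form $\widehat{\Gamma}_n$, and take the leading eigenvectors of $\widehat{\Sigma}_n^{-1}\widehat{\Gamma}_n$), here is a minimal NumPy sketch. The function name `sir_directions`, the equal-count slicing scheme, and the default number of slices are illustrative choices, not taken from the paper.

```python
import numpy as np

def sir_directions(x, y, n_slices=10, n_directions=2):
    """Sketch of the S.I.R. estimation steps described in the abstract.

    x : (n, p) array of explanatory variables
    y : (n,) array of responses
    Returns the leading estimated e.d.r. directions (as columns) and
    the eigenvalues of Sigma_hat^{-1} Gamma_hat in decreasing order.
    """
    n, p = x.shape
    x_bar = x.mean(axis=0)
    sigma_hat = np.cov(x, rowvar=False)            # sample covariance of x

    # Partition the range of y into slices with (roughly) equal counts.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    # Gamma_hat: covariance of the within-slice means of x,
    # weighted by the slice proportions.
    gamma_hat = np.zeros((p, p))
    for idx in slices:
        m = x[idx].mean(axis=0) - x_bar            # centered slice mean
        gamma_hat += (len(idx) / n) * np.outer(m, m)

    # Eigendecomposition of Sigma_hat^{-1} Gamma_hat; its leading
    # eigenvectors estimate a basis of the e.d.r. space E.
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(sigma_hat, gamma_hat))
    rank = np.argsort(eigvals.real)[::-1]
    return eigvecs.real[:, rank[:n_directions]], eigvals.real[rank]

# Toy usage: a single-index model y = (beta' x)^3 + noise.
rng = np.random.default_rng(0)
x = rng.standard_normal((500, 5))
beta = np.array([1.0, 1.0, 0.0, 0.0, 0.0])
y = (x @ beta) ** 3 + 0.1 * rng.standard_normal(500)
b_hat, lam = sir_directions(x, y, n_slices=10, n_directions=1)
```

Using `np.linalg.solve` rather than forming the explicit inverse of the sample covariance is purely a numerical choice; in this toy example the leading estimated direction should be close, up to scale, to `beta`, in line with the theory the abstract summarizes.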
