Abstract

Sliced inverse regression [Li (1989), (1991) and Duan and Li (1991)] is a nonparametric method for achieving dimension reduction in regression problems. It is widely applicable, extremely easy to implement on a computer, and requires no nonparametric smoothing devices such as kernel regression. If $Y$ is the response and $X \in \mathbf{R}^p$ is the predictor, implementing sliced inverse regression requires an estimate of $\Lambda = E\{\operatorname{cov}(X\mid Y)\} = \operatorname{cov}(X) - \operatorname{cov}\{E(X\mid Y)\}$. The inverse regression of $X$ on $Y$ is clearly seen in $\Lambda$. One such estimate is Li's (1991) two-slice estimate, defined as follows: the data are sorted on $Y$, then grouped into sets of size 2, the covariance of $X$ is estimated within each group, and these estimates are averaged. In this paper, we consider the asymptotic properties of the two-slice method, obtaining simple conditions for $n^{1/2}$-convergence and asymptotic normality. A key step in the proof of asymptotic normality is a central limit theorem for sums of conditionally independent random variables. We also study the asymptotic distribution of Greenwood's statistic in nonuniform cases.
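As an illustration, here is a minimal sketch of the two-slice estimate described above, written in NumPy. The function name `two_slice_lambda` and the within-pair normalization (the unbiased sample covariance of a pair, i.e., divisor 1) are illustrative assumptions based only on the verbal description in the abstract, not the paper's exact definition.

```python
import numpy as np

def two_slice_lambda(X, y):
    """Sketch of the two-slice estimate of Lambda = E{cov(X | Y)}:
    sort the data on Y, group consecutive observations into pairs,
    estimate cov(X) within each pair, and average those estimates.
    (Assumed normalization: unbiased within-pair covariance.)
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    n, p = X.shape
    order = np.argsort(y)        # sort the data on Y
    Xs = X[order]
    m = n // 2                   # number of size-2 slices (odd n: last point dropped)
    Lam = np.zeros((p, p))
    for i in range(m):
        d = Xs[2 * i + 1] - Xs[2 * i]
        # For a pair {x1, x2}, the sample covariance with divisor 1
        # is (1/2)(x1 - x2)(x1 - x2)^T.
        Lam += 0.5 * np.outer(d, d)
    return Lam / m               # average over the slices
```

Given this estimate, $\operatorname{cov}\{E(X\mid Y)\}$ can be estimated by the identity above as `np.cov(X, rowvar=False) - two_slice_lambda(X, y)`, which is the matrix whose leading eigenvectors sliced inverse regression uses to find the dimension-reduction directions.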
