Abstract

Sufficient dimension reduction aims to project high-dimensional predictors onto a low-dimensional space without losing information about the response. Classical methods such as sliced inverse regression, sliced average variance estimation, and directional regression are the backbone of many modern sufficient dimension reduction methods and have attracted considerable research interest. However, the efficiency of these methods deteriorates when dealing with sparse models. Existing sparse sufficient dimension reduction methods in the literature rely on given models or strict sparsity assumptions. To relax these model and sparsity assumptions, in this paper we define a general least squares objective function that is applicable to the kernel matrices of all classical sufficient dimension reduction methods, and we propose a Mallows model averaging based sufficient dimension reduction method. An iterative least squares algorithm is used to obtain the sample estimates. Our method demonstrates excellent performance in simulations.
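The model-averaging idea of combining the kernel matrices of several classical methods can be illustrated with a minimal sketch. The sketch below forms SIR and SAVE candidate matrices on whitened predictors, averages them with fixed illustrative weights (the paper chooses weights by a Mallows-type criterion, which is not reproduced here), and takes leading eigenvectors as the estimated directions; all function names are hypothetical and the data-generating model is invented for illustration.

```python
import numpy as np

def _whiten(X):
    # Standardize: Z = (X - mean) @ Sigma^{-1/2}, returning Sigma^{-1/2}
    mu = X.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return (X - mu) @ inv_sqrt, inv_sqrt

def sir_kernel(Z, y, n_slices=5):
    # SIR candidate matrix: slice-weighted covariance of slice means of Z
    n, p = Z.shape
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    return M

def save_kernel(Z, y, n_slices=5):
    # SAVE candidate matrix: slice-weighted average of (I - Cov(Z | slice))^2
    n, p = Z.shape
    M, I = np.zeros((p, p)), np.eye(p)
    for idx in np.array_split(np.argsort(y), n_slices):
        D = I - np.cov(Z[idx], rowvar=False)
        M += (len(idx) / n) * (D @ D)
    return M

def averaged_sdr(X, y, weights=(0.5, 0.5), d=1, n_slices=5):
    # Average the candidate kernels, then map the leading eigenvectors
    # back to the original X scale to span the estimated subspace.
    Z, inv_sqrt = _whiten(X)
    M = (weights[0] * sir_kernel(Z, y, n_slices)
         + weights[1] * save_kernel(Z, y, n_slices))
    _, vecs = np.linalg.eigh(M)
    B = inv_sqrt @ vecs[:, ::-1][:, :d]  # top-d directions
    return B / np.linalg.norm(B, axis=0)
```

With a single-index model such as `y = X[:, 0] + noise`, the combined estimator recovers a direction close to the first coordinate axis; the real method additionally optimizes the weights via the Mallows criterion and handles sparsity through the iterative least squares algorithm.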
