Abstract

Dimension reduction plays an important role in high-dimensional data analysis. Principal component analysis, independent component analysis, and sliced inverse regression (SIR) are well-known but very different tools for dimension reduction. All three approaches can be seen as a comparison of two different scatter matrices S1 and S2. The components for dimension reduction are then given by the eigenvectors of S1⁻¹S2. In SIR, the second scatter matrix is supervised, so the choice of the components is based on the dependence between the observed random vector and a response variable. Based on these notions, we extend invariant coordinate selection (ICS) by allowing the second scatter matrix S2 to be supervised; supervised ICS can then be used in supervised dimension reduction. Remarkably, many supervised dimension reduction methods proposed in the literature, such as linear discriminant analysis, canonical correlation analysis, SIR, sliced average variance estimation, directional regression, and principal Hessian directions, can be reformulated in this way. Several families of supervised scatter matrices are discussed, and their use in supervised dimension reduction is illustrated with a real data example and simulations.
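To make the eigen-comparison concrete, here is a minimal sketch in Python/NumPy, not the paper's implementation: it assumes S1 is the sample covariance matrix and S2 a SIR-type scatter built from the slice means of x with slices defined by the response; the helper names sir_scatter and supervised_ics and the default of five slices are illustrative choices.

```python
import numpy as np

def sir_scatter(x, y, n_slices=5):
    """SIR-type supervised scatter S2: weighted covariance of the
    within-slice means of x, with slices formed by sorting y."""
    n, p = x.shape
    slices = np.array_split(np.argsort(y), n_slices)
    xbar = x.mean(axis=0)
    s2 = np.zeros((p, p))
    for idx in slices:
        d = x[idx].mean(axis=0) - xbar        # centered slice mean
        s2 += (len(idx) / n) * np.outer(d, d)
    return s2

def supervised_ics(x, y, n_slices=5):
    """Components as eigenvectors of S1^{-1} S2, with S1 = Cov(x)."""
    s1 = np.cov(x, rowvar=False)
    s2 = sir_scatter(x, y, n_slices)
    evals, evecs = np.linalg.eig(np.linalg.solve(s1, s2))  # S1^{-1} S2
    order = np.argsort(evals.real)[::-1]      # largest eigenvalues first
    return evals.real[order], evecs.real[:, order]

# Toy check: y depends on x only through its first coordinate,
# so one eigenvalue dominates and its eigenvector is close to e1.
rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 5))
y = x[:, 0] + 0.1 * rng.normal(size=1000)
evals, evecs = supervised_ics(x, y)
print(np.round(evals, 3))       # one clearly nonzero eigenvalue
print(np.round(evecs[:, 0], 2)) # roughly (±1, 0, 0, 0, 0)
```

In this view the eigenvalues indicate how much of the dependence between x and y each component carries; swapping a different supervised scatter in for S2 is what yields the other methods listed in the abstract.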
