Abstract

In this paper, we discuss dimension reduction of the predictors X ∈ R^p in the regression of Y on X under a notion of sufficiency called sufficient dimension reduction. In sufficient dimension reduction, the original predictors X are replaced by a lower-dimensional linear projection of X without loss of information about selected aspects of the conditional distribution of Y|X. Depending on the aspect of interest, the central subspace, the central mean subspace, and the central kth-moment subspace are defined and investigated as the primary objects of interest. We then study the relationships among the three subspaces and how each of them changes under a non-singular linear transformation of X. We discuss two conditions, constraining the marginal distribution of X and the conditional distribution of Y|X, that guarantee the existence of the three subspaces. A general approach to estimating the subspaces is also introduced, along with an explanation of the conditions commonly assumed in most sufficient dimension reduction methodologies.
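
For reference, the three subspaces named above can be summarized as follows. This is a hedged sketch using the standard definitions from the sufficient dimension reduction literature, with B denoting a basis matrix of the subspace in question; the paper's exact formulation may differ.

\begin{align*}
&\text{dimension reduction subspace } \mathcal{S}(B): && Y \perp\!\!\!\perp X \mid B^{\top}X,\\
&\text{central subspace } \mathcal{S}_{Y|X}: && \text{the intersection of all dimension reduction subspaces,}\\
& && \text{provided that intersection is itself one,}\\
&\text{central mean subspace } \mathcal{S}_{E(Y|X)}: && \text{the smallest } \mathcal{S}(B) \text{ with } E(Y \mid X) = E(Y \mid B^{\top}X),\\
&\text{central $k$th-moment subspace } \mathcal{S}^{(k)}_{Y|X}: && \text{the smallest } \mathcal{S}(B) \text{ with } E(Y^{j} \mid X) = E(Y^{j} \mid B^{\top}X),\ j = 1,\dots,k.
\end{align*}

Under these definitions the subspaces are nested, $\mathcal{S}_{E(Y|X)} = \mathcal{S}^{(1)}_{Y|X} \subseteq \mathcal{S}^{(k)}_{Y|X} \subseteq \mathcal{S}_{Y|X}$, which is the relationship referred to in the abstract. The conditions commonly assumed by estimation methods typically include a linearity condition on $E(X \mid B^{\top}X)$, satisfied, for example, when X is elliptically distributed.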
