Abstract

In this paper, we discuss dimension reduction of the predictors X ∈ ℝ^p in the regression of Y on X under a notion of sufficiency called sufficient dimension reduction. In sufficient dimension reduction, the original predictors X are replaced by a lower-dimensional linear projection without loss of information about selected aspects of the conditional distribution of Y|X. Depending on those aspects, the central subspace, the central mean subspace and the central kth-moment subspace are defined and investigated as the primary objects of interest. We then study the relationships among the three subspaces and how they change under non-singular linear transformations of X. We also discuss two conditions that guarantee the existence of the three subspaces; these conditions constrain the marginal distribution of X and the conditional distribution of Y|X. Finally, a general approach to estimating the subspaces is introduced, along with an explanation of the conditions commonly assumed in most sufficient dimension reduction methodologies.
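For orientation, the three subspaces named above are commonly formalized as follows. This is a standard formulation from the sufficient dimension reduction literature, sketched here for reference; the notation (a p × d matrix B whose columns span a candidate subspace) is assumed rather than taken from the paper, whose exact definitions may differ.

% Standard formulations (notation assumed, not quoted from the paper).
% A dimension-reduction subspace span(B) satisfies Y independent of X given B^T X;
% the central subspace is the intersection of all such subspaces, provided that
% intersection is itself a dimension-reduction subspace:
\mathcal{S}_{Y|X} = \bigcap \{ \operatorname{span}(B) : Y \perp\!\!\!\perp X \mid B^{\top} X \}
% The central mean subspace preserves only the conditional mean of Y given X:
\mathcal{S}_{E(Y|X)} = \bigcap \{ \operatorname{span}(B) : E(Y \mid X) = E(Y \mid B^{\top} X) \}
% The central k-th moment subspace preserves the first k conditional moments:
\mathcal{S}^{(k)}_{Y|X} = \bigcap \{ \operatorname{span}(B) : E(Y^{j} \mid X) = E(Y^{j} \mid B^{\top} X), \; j = 1, \dots, k \}

Under these definitions the subspaces nest, \mathcal{S}_{E(Y|X)} \subseteq \mathcal{S}^{(k)}_{Y|X} \subseteq \mathcal{S}_{Y|X}, since preserving the full conditional distribution preserves all conditional moments and in particular the mean; relationships of this kind among the three subspaces are what the paper examines.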
