Abstract

Professor Cook is to be congratulated for his groundbreaking work on dimension reduction in regression. The paper develops a general theoretical foundation for studying principal components and other dimension reduction methods in a regression context. This framework yields a basis for elucidating the strengths, weaknesses and relationships among the various dimension reduction methods, including ordinary least squares (OLS), principal components regression (PCR), sliced inverse regression (SIR), parametric inverse regression and partial least squares. The promising new method, principal fitted components (PFC), appears to outperform some long-standing approaches such as PCR, OLS and SIR. Finally, as a result of this contribution, the standard approach to regression, with its emphasis on fixed predictors and the need to assume away the randomness of X, and the standard approach to principal components, with its focus on the correlation matrix rather than the covariance matrix, both seem to be called into question.

Specific contributions of Professor Cook's paper include the following: (1) It provides a theoretical foundation for the widely used principal components regression. (2) It adopts a model, and thus a likelihood function, through the inverse regression of the predictors given the response, to study sufficient reduction in a forward regression problem. Consequently, likelihood-based inferences can be developed, and the inferential capabilities of dimension reduction are moved closer to mainstream regression methodology. (3) It permits extension to categorical or mixtures of continuous and categorical predictors, an area that most existing model-free dimension reduction approaches do not handle.
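To make point (2) concrete, the inverse regression model underlying principal fitted components can be sketched roughly as follows; the parameterization shown (with symbols \(\mu, \Gamma, \beta, f_y, \varepsilon\)) is illustrative of the general form of Cook's proposal rather than a restatement of his exact assumptions:
\[
X_y \;=\; \mu + \Gamma \beta f_y + \varepsilon, \qquad \Gamma \in \mathbb{R}^{p \times d}, \quad \beta \in \mathbb{R}^{d \times r}, \quad \varepsilon \sim N(0, \sigma^2 I_p),
\]
where \(X_y\) denotes the random predictor vector conditioned on the response \(Y = y\) and \(f_y\) is a known \(r\)-dimensional function of \(y\). Under a model of this form, \(\Gamma^{T} X\) is a sufficient reduction of the predictors, and maximizing the resulting likelihood over \(\Gamma\) yields the principal fitted components referred to above.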
