Abstract

The article provides insight into three of the most popular methods for data dimension reduction: principal component analysis (PCA), exploratory factor analysis (EFA) and linear discriminant analysis (LDA). First, the basic definitions and notations for the data matrices are presented and illustrated with a small real data set. Next, some essential facts from matrix theory are summarized. The two main tools for dimension reduction, the eigenvalue decomposition and the singular value decomposition, help to explain how PCA and EFA work with either a correlation matrix or a raw data matrix. To see them in action, PCA and EFA are applied to the small data set, and the results are interpreted and compared. Finally, LDA, which achieves dimension reduction for data whose observations are divided into several groups, is considered. Like PCA and EFA, the method operates either on the between- and within-groups covariance matrices or, more economically, on the raw data.
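As a minimal illustration of the equivalence mentioned above (not taken from the article, and using a synthetic data set rather than the article's small real one), the sketch below computes PCA in two ways: via the eigenvalue decomposition of the correlation matrix and via the singular value decomposition of the standardized raw data matrix.

```python
# Illustrative sketch: PCA via eigendecomposition of the correlation matrix
# and, equivalently, via SVD of the standardized data matrix.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 4))                  # n = 20 observations, p = 4 variables

# Standardize: center each column and scale to unit variance
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
n = Z.shape[0]

# Route 1: eigenvalue decomposition of the correlation matrix R = Z'Z / (n - 1)
R = Z.T @ Z / (n - 1)
eigvals, eigvecs = np.linalg.eigh(R)          # returned in ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]

# Route 2: singular value decomposition of the standardized data, Z = U S V'
U, S, Vt = np.linalg.svd(Z, full_matrices=False)

# Eigenvalues of R equal the squared singular values of Z divided by (n - 1);
# eigenvectors match the right singular vectors up to sign.
print(np.allclose(eigvals, S**2 / (n - 1)))   # True
print(np.allclose(np.abs(eigvecs), np.abs(Vt.T)))  # True

# Principal component scores (the dimension-reduced data)
scores = Z @ eigvecs
```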
