Abstract

An orthogonal transformation can remove the correlations among candidate features, so that the extracted features do not interfere with one another, and an orthogonal set of discriminant vectors is more powerful than the classical discriminant vectors. In this paper, we present a new orthogonal linear discriminant analysis (OLDA) model based on least-squares approximation, called LS-OLDA, for pattern classification. It aims to find an orthogonal transformation W and a diagonal matrix D such that the difference between a given scatter matrix and WDW^T is minimized in the least-squares sense while the trace of D is maximized simultaneously. Theoretical analysis shows that the proposed model coincides with the classical OLDA criterion. Experimental results on standard data sets, compared with related methods, show that LS-OLDA achieves or closely approaches the best accuracy while having lower computational cost.
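A minimal sketch of the underlying idea, not the paper's exact algorithm: it assumes the matrix being approximated is the between-class scatter whitened by the total scatter (an assumption; the abstract does not specify the matrix), and it uses the fact that, for a symmetric matrix, the least-squares fit W D W^T with orthonormal W and diagonal D is attained by the symmetric eigendecomposition, where keeping the largest eigenvalues also maximizes the trace of D. The function names ls_olda_sketch and sqrtm_psd are hypothetical.

```python
import numpy as np


def sqrtm_psd(A):
    """Symmetric square root of a positive semidefinite matrix."""
    evals, evecs = np.linalg.eigh(A)
    evals = np.clip(evals, 0.0, None)
    return evecs @ np.diag(np.sqrt(evals)) @ evecs.T


def ls_olda_sketch(X, y, d):
    """Illustrative LS-OLDA-style fit: S ~= W D W^T with orthonormal W, diagonal D."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    mean_all = X.mean(axis=0)

    # Total scatter and between-class scatter.
    Xc = X - mean_all
    St = Xc.T @ Xc
    Sb = np.zeros_like(St)
    for c in np.unique(y):
        diff = (X[y == c].mean(axis=0) - mean_all)[:, None]
        Sb += np.sum(y == c) * (diff @ diff.T)

    # Assumed target matrix: between-class scatter whitened by the total scatter
    # (pseudo-inverse guards against a singular St).
    St_inv_sqrt = np.linalg.pinv(sqrtm_psd(St))
    S = St_inv_sqrt @ Sb @ St_inv_sqrt

    # For symmetric S, the least-squares approximation W D W^T with orthonormal
    # W and diagonal D is given by the eigendecomposition; keeping the d largest
    # eigenvalues also maximizes trace(D).
    evals, evecs = np.linalg.eigh(S)
    order = np.argsort(evals)[::-1][:d]
    return evecs[:, order], np.diag(evals[order])


# Toy usage: two Gaussian classes in 5 dimensions, one discriminant direction.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 5)), rng.normal(2, 1, (50, 5))])
y = np.array([0] * 50 + [1] * 50)
W, D = ls_olda_sketch(X, y, d=1)
print(W.shape, np.diag(D))
```

The columns of W returned here are orthonormal by construction, which mirrors the orthogonality constraint on the discriminant vectors described in the abstract.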
