Abstract

Quadratic discriminant analysis (QDA) is a widely used classification technique that generalizes the linear discriminant analysis (LDA) classifier to the case of distinct covariance matrices among classes. For the QDA classifier to yield high classification performance, an accurate estimation of the covariance matrices is required. This task becomes all the more challenging in high-dimensional settings, wherein the number of observations is comparable to the feature dimension. A popular way to enhance the performance of the QDA classifier under these circumstances is to regularize the covariance matrix, giving the name regularized QDA (R-QDA) to the corresponding classifier. In this work, we consider the case in which the population covariance matrix has a spiked covariance structure, a model that is often assumed in several applications. Building on the classical QDA, we propose a novel quadratic classification technique whose parameters are chosen such that the Fisher discriminant ratio is maximized. Numerical simulations show that the proposed classifier not only outperforms the classical R-QDA on both synthetic and real data but also requires lower computational complexity, making it suitable for high-dimensional settings.
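
For orientation, the baseline that the abstract refers to can be summarized in a few lines of code. The sketch below is a minimal illustration of a Gaussian QDA discriminant with a shrinkage-regularized covariance estimate (one common form of R-QDA); the function names and the regularization parameter gamma are illustrative assumptions, not the paper's exact formulation or proposed classifier.

```python
# Minimal sketch of a regularized QDA (R-QDA) discriminant score.
# The shrinkage form and names (gamma, qda_score, classify) are
# illustrative assumptions, not the paper's proposed method.
import numpy as np

def regularized_covariance(X, gamma):
    """Shrink the sample covariance toward a scaled identity:
    Sigma_hat = (1 - gamma) * S + gamma * (tr(S)/p) * I."""
    S = np.cov(X, rowvar=False)
    p = S.shape[0]
    return (1.0 - gamma) * S + gamma * (np.trace(S) / p) * np.eye(p)

def qda_score(x, mean, cov, prior):
    """Gaussian QDA discriminant: log prior + log-density terms."""
    _, logdet = np.linalg.slogdet(cov)
    diff = x - mean
    maha = diff @ np.linalg.solve(cov, diff)  # Mahalanobis distance
    return np.log(prior) - 0.5 * logdet - 0.5 * maha

def classify(x, means, covs, priors):
    """Assign x to the class with the largest discriminant score."""
    scores = [qda_score(x, m, C, pi) for m, C, pi in zip(means, covs, priors)]
    return int(np.argmax(scores))
```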

Highlights

  • Classification is among the most typical examples of supervised learning techniques

  • When the data is normally distributed with common covariance matrices across classes, linear discriminant analysis (LDA) is known to be the optimal classifier in terms of misclassification rate minimization

  • It can be more advisable to employ quadratic discriminant analysis (QDA), which turns out to be the optimal classifier under Gaussian data and known statistics


Summary

INTRODUCTION

Classification is among the most typical examples of supervised learning techniques. When the data is normally distributed with common covariance matrices across classes, linear discriminant analysis (LDA) is known to be the optimal classifier in terms of misclassification rate minimization. In the case of different covariances across classes, it has recently been shown that LDA cannot leverage the information contained in the differences between the covariance matrices [1]. Under such circumstances, it can be more advisable to employ quadratic discriminant analysis (QDA), which turns out to be the optimal classifier under Gaussian data and known statistics. In this work, we further assume that the population covariance matrix associated with each class is a low-rank perturbation of a scaled identity; that is, it is isotropic except for a finite number of symmetry-breaking directions. Such a model arises in many real applications, such as detection [10], electroencephalogram (EEG) signals [11], [12], and financial econometrics [13], [14], and is known in the random matrix theory literature as the spiked covariance model.
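
As a concrete illustration of the spiked covariance model, the snippet below builds a population covariance equal to a scaled identity plus a rank-r perturbation and draws Gaussian samples from it. The dimension, number of spikes, spike strengths, and variable names are assumptions chosen purely for illustration, not values from the paper.

```python
# Illustrative sketch of the spiked covariance model:
#   Sigma = sigma^2 * I + sum_k w_k * v_k v_k^T
# i.e., a scaled identity plus a finite-rank perturbation.
# All parameter values below are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
p, r, sigma2 = 100, 2, 1.0           # feature dimension, number of spikes, noise level
weights = np.array([10.0, 5.0])      # spike strengths (symmetry-breaking directions)

# Orthonormal spike directions v_1, ..., v_r
V, _ = np.linalg.qr(rng.standard_normal((p, r)))

# Spiked population covariance matrix
Sigma = sigma2 * np.eye(p) + (V * weights) @ V.T

# Draw n Gaussian observations with this population covariance
n = 50
X = rng.multivariate_normal(mean=np.zeros(p), cov=Sigma, size=n)
```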

NOTATIONS
QUADRATIC DISCRIMINANT ANALYSIS
PROPOSED CLASSIFICATION RULE
PARAMETER OPTIMIZATION
NUMERICAL SIMULATIONS
CONCLUSION
