Abstract

Linear Discriminant Analysis (LDA) yields the Bayes optimal classifier for binary classification of normally distributed classes with equal covariance. To improve the performance of LDA, heteroscedastic LDA (HLDA), which removes the equal-covariance assumption, has been developed. In this paper, we show using first- and second-order optimality conditions that the existing approaches either have no principled computational procedure for optimal parameter selection, or underperform in terms of classification accuracy and the area under the receiver operating characteristic curve (AUC) under class imbalance. Using the same optimality conditions, we then derive a dynamic Bayes optimal linear classifier for heteroscedastic LDA that is optimised via an efficient iterative procedure and is robust against class imbalance. Experimental work is conducted on two artificial and eight real-world datasets. Our results show that the proposed algorithm compares favourably with the existing heteroscedastic LDA procedures, as well as the linear support vector machine (SVM), in terms of the error rate, but is superior to all of these algorithms in terms of the AUC under class imbalance. The fast training time of the proposed algorithm also encourages its use in large-data applications that exhibit a high incidence of class imbalance, such as human activity recognition.
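
For context, the homoscedastic result the abstract builds on is standard background rather than part of the paper's contribution: for two classes x ~ N(mu_k, Sigma), k in {0, 1}, with priors pi_0, pi_1 and a shared covariance Sigma, the Bayes optimal classifier is the linear rule sketched below.

\[
  w = \Sigma^{-1}(\mu_1 - \mu_0), \qquad
  b = -\tfrac{1}{2}(\mu_1 + \mu_0)^\top \Sigma^{-1}(\mu_1 - \mu_0) + \log\frac{\pi_1}{\pi_0},
\]

with x assigned to class 1 if and only if \( w^\top x + b > 0 \). When the classes have unequal covariances \( \Sigma_0 \neq \Sigma_1 \), the Bayes classifier is quadratic rather than linear, which is why heteroscedastic LDA must instead select the best linear classifier according to some criterion; the paper's derivation of that criterion via the stated optimality conditions is given in the full text, not reproduced here.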
