Abstract

In this paper, we theoretically analyze properties relating Fisher's classifier to the optimal quadratic classifier when the latter is derived using a particular covariance matrix for the classes. We propose an efficient approach for selecting the threshold after the linear transformation onto the one-dimensional space is performed. We achieve this by selecting the decision boundary that minimizes the classification error in the transformed space, assuming that the univariate random variables are normally distributed. Our empirical results on synthetic and real-life data sets show that our approach leads to a smaller classification error than the traditional Fisher's classifier. The results also demonstrate that minimizing the classification error in the transformed space leads to a smaller classification error in the original multi-dimensional space.
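The general idea described above can be sketched as follows: project the data onto Fisher's discriminant direction, fit a univariate normal to each projected class, and place the threshold where the prior-weighted class densities are equal, which minimizes the classification error for two Gaussians. This is a minimal illustration of the overall scheme, not the paper's exact algorithm; the function names and the root-selection heuristic are our own assumptions.

```python
import numpy as np

def fisher_direction(X0, X1):
    # Fisher's discriminant direction: w = Sw^{-1} (mu1 - mu0),
    # where Sw is the within-class scatter matrix.
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))
    return np.linalg.solve(Sw, mu1 - mu0)

def optimal_threshold(z0, z1):
    # Fit a univariate normal to each class of projected samples and
    # solve for the point where the prior-weighted densities are equal
    # (the minimum-error boundary for two Gaussians). Equating the
    # log-densities yields a quadratic a*x^2 + b*x + c = 0.
    m0, s0 = z0.mean(), z0.std(ddof=1)
    m1, s1 = z1.mean(), z1.std(ddof=1)
    p0 = len(z0) / (len(z0) + len(z1))
    p1 = 1.0 - p0
    a = 1.0 / s1**2 - 1.0 / s0**2
    b = 2.0 * (m0 / s0**2 - m1 / s1**2)
    c = (m1**2 / s1**2 - m0**2 / s0**2
         + 2.0 * np.log((p0 * s1) / (p1 * s0)))
    if abs(a) < 1e-12:
        # Equal variances: the quadratic degenerates to a single
        # linear boundary (the classical case).
        return -c / b
    roots = np.roots([a, b, c])
    # Heuristic (our assumption): keep the root lying between the
    # two projected class means, which is the relevant boundary.
    lo, hi = min(m0, m1), max(m0, m1)
    between = [r.real for r in roots if lo <= r.real <= hi]
    if between:
        return between[0]
    return roots[np.argmin(np.abs(roots.real - 0.5 * (m0 + m1)))].real

# Usage on synthetic two-class Gaussian data:
rng = np.random.default_rng(0)
X0 = rng.normal([0.0, 0.0], 1.0, size=(200, 2))
X1 = rng.normal([3.0, 3.0], 1.0, size=(200, 2))
w = fisher_direction(X0, X1)
z0, z1 = X0 @ w, X1 @ w
t = optimal_threshold(z0, z1)
```

Here the threshold `t` replaces the usual midpoint-style rule of Fisher's classifier; classification then reduces to comparing each projected sample against `t`.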
