Abstract

Fisher discriminant analysis (FDA) is a well-known method for classification. However, FDA yields an optimal projection only for Gaussian class distributions with equal covariance matrices; in other words, FDA is not optimal for heteroscedastic Gaussian distributions. In this paper, we propose a novel criterion for FDA that includes a correction term based on the Bhattacharyya distance, which is closely related to the classification rate. Furthermore, we propose a Chernoff-distance-based criterion and its kernelized version as extensions. The proposed criteria have three strengths. First, the correction term based on the Bhattacharyya distance can handle heteroscedastic Gaussian distributions. Second, the correction term based on the Chernoff distance can easily accommodate differences in the number of samples per class. Third, their kernel extensions are easily implemented, making them applicable to non-Gaussian distributions. As a result, the proposed method is applicable to a wide variety of classification problems. Experimental results on toy simulations and nine real-world UCI datasets show marked advantages of the proposed method over conventional FDA.
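The abstract does not reproduce the paper's corrected criterion, but the Chernoff distance between two class-conditional Gaussians, which the correction term builds on, has a standard closed form: a Mahalanobis-type mean term plus a log-determinant covariance term, reducing to the Bhattacharyya distance at s = 0.5. A minimal sketch (function name and toy inputs are illustrative, not from the paper):

```python
import numpy as np

def chernoff_distance(mu1, cov1, mu2, cov2, s=0.5):
    """Chernoff distance between Gaussians N(mu1, cov1) and N(mu2, cov2).

    With s = 0.5 this reduces to the Bhattacharyya distance.
    """
    # Interpolated covariance: (1 - s) * cov1 + s * cov2
    cov_s = (1 - s) * cov1 + s * cov2
    diff = mu2 - mu1
    # Mean-separation term: (s(1-s)/2) * diff^T cov_s^{-1} diff
    mean_term = 0.5 * s * (1 - s) * diff @ np.linalg.solve(cov_s, diff)
    # Covariance-mismatch term via numerically stable log-determinants
    _, logdet_s = np.linalg.slogdet(cov_s)
    _, logdet1 = np.linalg.slogdet(cov1)
    _, logdet2 = np.linalg.slogdet(cov2)
    cov_term = 0.5 * (logdet_s - (1 - s) * logdet1 - s * logdet2)
    return mean_term + cov_term

# Toy check: identical Gaussians give distance 0; for unit covariances
# the s = 0.5 case equals (1/8) * ||mu2 - mu1||^2.
d0 = chernoff_distance(np.zeros(2), np.eye(2), np.zeros(2), np.eye(2))
d1 = chernoff_distance(np.zeros(2), np.eye(2), np.array([2.0, 0.0]), np.eye(2))
```

Note that the covariance-mismatch term is nonzero even when the means coincide, which is exactly the heteroscedastic information that the classical FDA criterion discards.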
