Abstract

Kernel Fisher discriminant analysis (KFDA) faces the problem of kernel parameter selection. A novel criterion for optimizing the KFDA kernel parameters is presented that simultaneously maximizes the class separability in kernel space and the uniformity of the class-pair separabilities. The criterion is also applied to kernel parameter selection for spectral regression kernel discriminant analysis (SRKDA). A minimum distance classifier, a k-nearest-neighbor (kNN) classifier, and a naive Bayes classifier are used to evaluate the feature extraction performance. Experiments on fourteen benchmark multiclass data sets show that, compared with a criterion that merely maximizes the class separability in kernel space, the presented criterion locates the optimal KFDA kernel parameters more accurately and performs better for SRKDA kernel parameter selection.
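The sketch below illustrates the general idea of such a criterion, not the authors' exact formulation: a candidate RBF kernel width is scored by combining an overall class-separability term with a uniformity term that penalizes uneven class-pair separabilities. The separability measure (squared distance between kernel mean embeddings), the uniformity penalty (negative coefficient of variation), and all function names such as pairwise_separability and select_gamma are illustrative assumptions.

```python
# Minimal sketch of a kernel-parameter selection criterion that rewards both
# class separability and uniformity of class-pair separabilities (assumed form).
import numpy as np
from itertools import combinations
from sklearn.metrics.pairwise import rbf_kernel

def pairwise_separability(X_a, X_b, gamma):
    """Squared distance between the kernel mean embeddings of two classes."""
    k_aa = rbf_kernel(X_a, X_a, gamma=gamma).mean()
    k_bb = rbf_kernel(X_b, X_b, gamma=gamma).mean()
    k_ab = rbf_kernel(X_a, X_b, gamma=gamma).mean()
    return k_aa + k_bb - 2.0 * k_ab

def combined_score(X, y, gamma):
    """Overall class separability plus uniformity of class-pair separabilities."""
    classes = np.unique(y)
    seps = np.array([
        pairwise_separability(X[y == a], X[y == b], gamma)
        for a, b in combinations(classes, 2)
    ])
    separability = seps.mean()                        # average pairwise separability
    uniformity = -seps.std() / (seps.mean() + 1e-12)  # penalize uneven class pairs
    return separability + uniformity

def select_gamma(X, y, candidates):
    """Grid search over candidate kernel widths; return the best-scoring one."""
    scores = [combined_score(X, y, g) for g in candidates]
    return candidates[int(np.argmax(scores))]

if __name__ == "__main__":
    from sklearn.datasets import load_iris
    X, y = load_iris(return_X_y=True)
    gammas = np.logspace(-3, 2, 12)
    print("selected gamma:", select_gamma(X, y, gammas))
```

In practice the selected kernel parameter would then be passed to a KFDA or SRKDA implementation before training the downstream classifiers mentioned above.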

