Abstract

This correspondence describes extensions to Fisher's linear discriminant function which allow differences in both class means and class covariances to be systematically included in a feature-reduction process. It is shown how the Fukunaga-Koontz transform can be combined with Fisher's method to reduce the feature space from many dimensions to two. Performance is generally superior to that of the Foley-Sammon method. The technique is developed to show how a new radius vector (or pair of radius vectors) can be combined with Fisher's vector to produce a classifier with even greater discriminating power. Illustrations of the technique show that good discrimination can be obtained even when there is considerable overlap of classes in any one projection.
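The general idea of pairing Fisher's mean-separating direction with a second, covariance-sensitive direction from the Fukunaga-Koontz transform can be illustrated with a minimal sketch. This is not the paper's algorithm verbatim: it assumes two classes supplied as row-matrices `X1` and `X2`, and the function names (`fisher_direction`, `fk_direction`, `project_2d`) are illustrative only.

```python
import numpy as np

def fisher_direction(X1, X2):
    """Fisher's linear discriminant direction: S_W^{-1} (mu1 - mu2)."""
    mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
    S_w = np.cov(X1, rowvar=False) + np.cov(X2, rowvar=False)  # pooled within-class scatter
    w = np.linalg.solve(S_w, mu1 - mu2)
    return w / np.linalg.norm(w)

def fk_direction(X1, X2):
    """One Fukunaga-Koontz direction, emphasizing covariance differences."""
    S1 = np.cov(X1, rowvar=False)
    S2 = np.cov(X2, rowvar=False)
    # Whiten the summed covariance so that S1' + S2' = I in the new space.
    evals, evecs = np.linalg.eigh(S1 + S2)
    P = evecs / np.sqrt(evals)            # whitening transform
    S1p = P.T @ S1 @ P
    # In the whitened space S1' and S2' share eigenvectors; eigenvalues
    # furthest from 0.5 separate the class covariances most strongly.
    d, V = np.linalg.eigh(S1p)
    best = np.argmax(np.abs(d - 0.5))
    w = P @ V[:, best]                    # map back to the original space
    return w / np.linalg.norm(w)

def project_2d(X, X1, X2):
    """Project data onto the Fisher axis and one Fukunaga-Koontz axis."""
    W = np.column_stack([fisher_direction(X1, X2), fk_direction(X1, X2)])
    return X @ W
```

The resulting two-dimensional projection keeps one axis for the difference in class means and one for the difference in class covariances, so classes that overlap along either axis alone may still separate in the plane.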
