Abstract

The L^{\alpha}-distance between posterior density functions (PDF's) is proposed as a separability measure to replace the probability of error as a criterion for feature extraction in pattern recognition. Upper and lower bounds on the Bayes error are derived for \alpha > 0. For \alpha = 1, the lower and upper bounds coincide; moving \alpha away from 1 in either direction loosens them. For \alpha = 2, the upper bound equals the best commonly used bound and coincides with the asymptotic probability of error of the first-nearest-neighbor classifier. The case \alpha = 1 is used to estimate the probability of error in several problem situations, and a comparison is made with other methods. It is shown how unclassified samples may also be used to reduce the variance of the estimated error. For the family of exponential probability density functions (pdf's), the relation between the distance of a sample from the decision boundary and its contribution to the error is derived. In the nonparametric case, a consistent estimator is discussed that is computationally more efficient than estimators based on Parzen's estimation. A set of computer simulation experiments is reported to demonstrate the statistical advantages of the separability measure with \alpha = 1 when used in an error estimation scheme.
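For concreteness, the following is a minimal numerical sketch (not from the paper) of the bounding behavior the abstract describes. In the two-class case, writing \eta(x) = P(\omega_1 | x) and J_\alpha = E|2\eta(x) - 1|^\alpha, Jensen's inequality gives (1/2)(1 - J_\alpha^{1/\alpha}) \le P_e \le (1/2)(1 - J_\alpha) for \alpha \ge 1, with both sides coinciding with P_e at \alpha = 1; at \alpha = 2 the upper bound equals E[2\eta(1 - \eta)], the asymptotic first-nearest-neighbor error. The Gaussian class-conditional densities, equal priors, and sample size below are illustrative assumptions, not the paper's experimental setup.

# Monte Carlo check of the L^alpha bounds on the Bayes error for an
# assumed two-class problem: x ~ N(-1, 1) under w1, x ~ N(+1, 1) under w2,
# equal priors. Illustrative sketch only, not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)

P1 = P2 = 0.5          # equal prior probabilities
n = 200_000            # Monte Carlo sample size (arbitrary choice)

# Draw labels and samples from the two-class Gaussian mixture.
is_w2 = rng.random(n) < P2
x = rng.normal(np.where(is_w2, 1.0, -1.0), 1.0)

def gauss(x, mu):
    # Unit-variance Gaussian density with mean mu.
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)

# Posterior probability eta(x) = P(w1 | x) via Bayes' rule.
p1, p2 = P1 * gauss(x, -1.0), P2 * gauss(x, 1.0)
eta = p1 / (p1 + p2)

# Bayes error P_e = E[min(eta, 1 - eta)], estimated over the mixture.
bayes_error = np.mean(np.minimum(eta, 1.0 - eta))

for alpha in (1.0, 2.0, 4.0):
    # J_alpha = E|P(w1|x) - P(w2|x)|^alpha = E|2 eta - 1|^alpha.
    J = np.mean(np.abs(2.0 * eta - 1.0) ** alpha)
    lower = 0.5 * (1.0 - J ** (1.0 / alpha))   # lower bound on P_e
    upper = 0.5 * (1.0 - J)                    # upper bound on P_e
    print(f"alpha={alpha}: {lower:.4f} <= P_e={bayes_error:.4f} <= {upper:.4f}")

At \alpha = 1 the printed bounds collapse onto the Bayes error estimate, and widening \alpha to 2 or 4 visibly loosens them, matching the abstract's claims; the \alpha = 2 upper bound is the nearest-neighbor asymptotic rate.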
