Abstract

The measurement dimensionality that maximizes the average (over possible training sets) probability of correct classification (P_cr) is investigated for the equiprobable two-class Gaussian problem with known common covariance matrix. The classification rule considered is the Bayes minimum error rule in which the estimated (sample) mean vectors are used in place of the true mean vectors. A basic question investigated is the variation, with dimensionality, in the Mahalanobis distance (between the underlying distributions) required to keep P_cr constant. Numerical results are plotted for several cases. Analytical results are obtained relating the rate of variation of the Mahalanobis distance with dimensionality to the corresponding asymptotic behaviour of P_cr. Results for more highly structured problems, involving specific covariance matrices, show that in some cases increasing correlation between the measurements yields higher values of P_cr. Approximate expressions are derived relating P_cr, dimensionality, training sample size, and the structure of the underlying probability density.
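
The quantities discussed above can be explored numerically. Below is a minimal Monte Carlo sketch (not from the paper) of the plug-in rule described in the abstract: two equiprobable Gaussian classes with known common covariance (taken as the identity, i.e. after whitening), sample means substituted for the true means, and P_cr averaged over independently drawn training sets. The function name `plug_in_pcr` and the specific parameter values are illustrative; the sketch only serves to show how P_cr can be estimated as a function of dimensionality, training sample size and Mahalanobis distance.

```python
import numpy as np

def plug_in_pcr(dim, n_train, delta, n_trials=2000, n_test=2000, seed=None):
    """Monte Carlo estimate of the average probability of correct
    classification (P_cr) for the plug-in Bayes rule: the true mean
    vectors are replaced by sample means, and the common covariance
    (identity here, after whitening) is assumed known.

    dim      -- measurement dimensionality
    n_train  -- training samples per class used to estimate each mean
    delta    -- Mahalanobis distance between the two class means
    """
    rng = np.random.default_rng(seed)
    # Place the true means symmetrically along the first axis so that
    # their Mahalanobis distance equals `delta`.
    mu1 = np.zeros(dim); mu1[0] = +delta / 2
    mu2 = np.zeros(dim); mu2[0] = -delta / 2

    correct, total = 0, 0
    for _ in range(n_trials):
        # Estimate the mean vectors from a fresh training set.
        m1 = mu1 + rng.standard_normal((n_train, dim)).mean(axis=0)
        m2 = mu2 + rng.standard_normal((n_train, dim)).mean(axis=0)
        w = m1 - m2                # discriminant direction (covariance = I)
        x0 = (m1 + m2) / 2         # midpoint used as the decision threshold
        # Equal numbers of test points from each class (equiprobable classes).
        x1 = mu1 + rng.standard_normal((n_test, dim))
        x2 = mu2 + rng.standard_normal((n_test, dim))
        correct += np.sum((x1 - x0) @ w > 0) + np.sum((x2 - x0) @ w <= 0)
        total += 2 * n_test
    return correct / total

if __name__ == "__main__":
    # With delta and n_train held fixed, added measurements contribute only
    # estimation noise; sweeping dim (or delta) lets one examine how the
    # Mahalanobis distance must grow with dimensionality to keep P_cr constant.
    for dim in (1, 2, 5, 10, 20, 50):
        print(dim, round(plug_in_pcr(dim, n_train=10, delta=2.0, seed=0), 3))
```

Such a simulation is only a numerical complement to the analytical and approximate expressions the paper derives; the choice of identity covariance is for convenience and does not reflect the specific covariance structures studied in the paper.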
