Abstract
For the problem of classifying an element (e.g., an unknown pattern) into one of two given categories, where the associated observables are distributed according to one of two known multivariate normal populations with a common covariance matrix, it is shown that the minimum Bayes risk is a strictly monotonic function of certain separability or statistical distance measures, regardless of the a priori probabilities and the assigned loss function. For the associated conditional expected losses, however, strict monotonicity holds if and only if a certain condition, dependent on these probabilities and the given loss function, is satisfied. These results remain valid for classification problems in which the observable can be transformed to normality by a one-to-one differentiable mapping.
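The monotonicity claim can be illustrated in its best-known special case: with equal priors and 0–1 loss, the minimum Bayes risk for two equal-covariance Gaussian classes equals Φ(−Δ/2), where Δ is the Mahalanobis distance between the class means and Φ is the standard normal CDF. The sketch below (assumed parameter values, not from the paper) computes Δ for a 2-D example and shows that the Bayes error decreases strictly as the distance grows:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def mahalanobis_2d(mu1, mu2, cov):
    # Delta = sqrt((mu1-mu2)^T Sigma^{-1} (mu1-mu2)) for a 2x2 covariance.
    d0, d1 = mu1[0] - mu2[0], mu1[1] - mu2[1]
    (a, b), (c, d) = cov
    det = a * d - b * c
    # Apply the explicit 2x2 inverse to the mean difference.
    q0 = (d * d0 - b * d1) / det
    q1 = (-c * d0 + a * d1) / det
    return math.sqrt(d0 * q0 + d1 * q1)

def bayes_error(delta):
    # Minimum Bayes risk under equal priors and 0-1 loss.
    return norm_cdf(-delta / 2.0)

# Two Gaussians with identity covariance, means 2 apart: Delta = 2.
delta = mahalanobis_2d((0.0, 0.0), (2.0, 0.0), ((1.0, 0.0), (0.0, 1.0)))
print(delta, bayes_error(delta))          # Delta = 2, error = Phi(-1) ~ 0.1587
print(bayes_error(3.0) < bayes_error(2.0))  # strictly decreasing in Delta
```

Under these assumptions the error is a strictly decreasing function of Δ, which is the content of the paper's more general result for arbitrary priors and loss functions.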
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence