Abstract

This chapter discusses methods for evaluating lower bounds on the probability of a correct decision under the minimum-distance classification, or topothetical, rule, under the assumption that the populations are separated by at least a given minimum distance. The desirable properties of decision rules based on sample analogues of the Mahalanobis distance stem from the fact that it emerges as the natural measure of dissimilarity between homoscedastic normal populations: it is equivalent to the Kullback–Leibler information measure, to the Jeffreys divergence, and to Bhattacharyya's measure of divergence between two densities. Analogues of the Mahalanobis distance also appear in the discrimination problem for infinite-dimensional normal distributions, that is, Gaussian processes. A special type of stochastic process that admits the same treatment as the finite-dimensional normal case is the p-dimensional normal diffusion process, or Wiener process with drift. The chapter also discusses the reduction of the corresponding topothetical, or identification, problem to the standard p-variate normal case.
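For homoscedastic normal populations N(mu_1, Sigma) and N(mu_2, Sigma), the equivalences referred to above are explicit (these are standard identities, not results specific to this chapter): writing Delta^2 = (mu_1 - mu_2)' Sigma^{-1} (mu_1 - mu_2) for the squared Mahalanobis distance, the Kullback–Leibler information equals Delta^2 / 2, the Jeffreys divergence equals Delta^2, and the Bhattacharyya distance equals Delta^2 / 8. A minimal sketch of the minimum-distance rule, assuming known population means and a common covariance matrix (in practice these would be replaced by their sample analogues, as the chapter indicates; the function names here are illustrative), might look as follows in Python:

    import numpy as np

    def mahalanobis_sq(x, mean, cov_inv):
        # Squared Mahalanobis distance of observation x from a population mean.
        d = x - mean
        return float(d @ cov_inv @ d)

    def min_distance_classify(x, means, cov):
        # Assign x to the population whose mean is nearest in Mahalanobis
        # distance; `means` is a list of p-vectors and `cov` is the common
        # p x p covariance matrix shared by all (homoscedastic) populations.
        cov_inv = np.linalg.inv(cov)
        return int(np.argmin([mahalanobis_sq(x, m, cov_inv) for m in means]))

A correct decision here means the rule returns the index of the population that actually generated x; lower bounds on the probability of this event can then be expressed in terms of the minimum pairwise separation assumed above.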
