Abstract

Classification systems based on linear discriminant analysis are employed in a variety of communications applications, in which the classes are most commonly characterized by known Gaussian PDFs. In this paper, the performance of these classifiers is analyzed in terms of the conditional probability of misclassification. Easily computed lower and upper bounds on this error probability are presented and shown to provide corresponding bounds on the number of Monte Carlo trials required to obtain a desired level of accuracy. The error probability bounds yield an exact and easily computed expression for the error probability in the case where there are only two classes and a single hyperplane. In the special case where misclassification into a nominated class is independent of all other misclassifications, successively tighter upper and lower bounds can be computed at the expense of successively higher-order products of the individual misclassification probabilities. Finally, bounds are provided on the number of Monte Carlo trials required to improve, with a suitably high confidence level, on the confidence interval formed by the error probability bounds.
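
As a minimal illustration of the setting summarized above (not the paper's own derivation or bounds), the sketch below compares a Monte Carlo estimate of the conditional misclassification probability against the classical closed-form result for two Gaussian classes sharing a common covariance, where the optimal decision boundary is a single hyperplane. The class parameters, the equal-prior assumption, and the trial count are illustrative choices, not values taken from the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Two Gaussian classes with a common covariance (illustrative values);
# the Bayes-optimal rule under equal priors is then a single hyperplane.
mu0 = np.array([0.0, 0.0])
mu1 = np.array([2.0, 1.0])
cov = np.array([[1.0, 0.3],
                [0.3, 1.0]])

# Linear discriminant w^T x + b: classify as class 1 when positive.
cov_inv = np.linalg.inv(cov)
w = cov_inv @ (mu1 - mu0)
b = -0.5 * w @ (mu1 + mu0)

# Exact conditional error probability given class 0: for equal priors
# and a common covariance this is Phi(-Delta/2), with Delta the
# Mahalanobis distance between the class means.
delta = np.sqrt((mu1 - mu0) @ cov_inv @ (mu1 - mu0))
p_exact = norm.cdf(-delta / 2.0)

# Monte Carlo estimate of the same conditional error probability.
n_trials = 200_000
x = rng.multivariate_normal(mu0, cov, size=n_trials)
p_mc = np.mean(x @ w + b > 0.0)

# Rough 95% confidence half-width of the estimate; it shrinks like
# 1/sqrt(n_trials), which is why trial-count bounds matter in practice.
half_width = 1.96 * np.sqrt(p_mc * (1.0 - p_mc) / n_trials)

print(f"exact  P(error | class 0) = {p_exact:.5f}")
print(f"MC est P(error | class 0) = {p_mc:.5f} +/- {half_width:.5f}")
```

In this two-class, single-hyperplane case the exact expression makes simulation unnecessary; the Monte Carlo run is shown only to make concrete the trade-off, central to the paper, between the number of trials and the width of the resulting confidence interval.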
