Abstract

The Anderson discriminant function has several properties that are useful both for solving classification problems and for evaluating posterior class probabilities. As the mathematical formalism, we use the same weighted least-squares method to approximate the Anderson discriminant function in the neighborhood of its zero values, both when solving the classification problem and when evaluating the posterior class probabilities at a given point in the feature space. In the support vector method, by contrast, the classification problem is solved via a quadratic programming problem with as many constraints as there are examples in the training sample, and evaluating posterior class probabilities requires an additional tool, the Platt calibrator, which converts the distance from a point to the decision boundary into a posterior class probability; the calibrator's parameters are found by the maximum likelihood method. On several example classification problems, we compare the performance of the two methods by the criterion of empirical risk. The results favor the method of approximating the Anderson discriminant function in the neighborhood of its zero values.
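The support vector baseline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the dataset is synthetic, and the Platt calibrator is realized as a logistic regression fitted by maximum likelihood on the SVM decision values, which yields the sigmoid P(y=1|f) = 1/(1 + exp(A·f + B)) that Platt scaling prescribes.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

# Synthetic two-class data standing in for a training sample (illustrative only).
X, y = make_classification(n_samples=200, n_features=4, random_state=0)

# The SVM training problem is a quadratic program with one constraint
# per training example, as noted in the abstract.
svm = SVC(kernel="rbf", gamma="scale").fit(X, y)

# Signed distance of each point to the separating boundary.
f = svm.decision_function(X).reshape(-1, 1)

# Platt calibrator: fit the sigmoid 1/(1 + exp(A*f + B)) by maximum
# likelihood; logistic regression on the decision values does exactly this.
platt = LogisticRegression().fit(f, y)
proba = platt.predict_proba(f)[:, 1]  # posterior probability of class 1

# Empirical risk: fraction of misclassified training examples,
# the comparison criterion used in the paper.
emp_risk = np.mean(svm.predict(X) != y)
```

In practice the calibrator should be fitted on held-out decision values (or via cross-validation) rather than on the training sample, to avoid biased probability estimates.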
