Abstract

In a J-class classification problem with data of the form (y_n, x_n), n = 1,…,N, where y_n ∈ {1,…,J} and x_n = (x_{1n},…,x_{Mn}), linear discriminant analysis produces estimated class boundaries that are linear in x_1,…,x_M. In this paper, a method is developed that estimates the conditional class probabilities in a function space larger than the linear function space; the decision rule based on those estimated conditional class probabilities can have highly nonlinear class boundaries. The method projects the conditional class probabilities onto a space spanned by cubic splines and is therefore called classification using splines (CUS). This new method seems to achieve misclassification error rates comparable to, and in some cases lower than, those of existing methods such as CART or the back-propagation neural network classifier.
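The core idea — projecting class-indicator responses onto a cubic-spline space and classifying by the largest estimated probability — can be illustrated with a minimal sketch. This is not the paper's actual CUS algorithm: the truncated-power basis, knot placement, and plain least-squares projection are simplifying assumptions for a single feature (M = 1), chosen only to show how a spline fit yields nonlinear class boundaries where LDA cannot.

```python
import numpy as np

def cubic_spline_basis(x, knots):
    # Truncated power basis for a cubic spline: 1, x, x^2, x^3, (x - k)_+^3
    cols = [np.ones_like(x), x, x**2, x**3]
    cols += [np.clip(x - k, 0.0, None) ** 3 for k in knots]
    return np.column_stack(cols)

def fit_sketch(x, y, n_classes, knots):
    # Least-squares projection of one-hot class indicators onto the spline space;
    # the fitted columns approximate the conditional class probabilities P(j | x)
    B = cubic_spline_basis(x, knots)
    Y = np.eye(n_classes)[y]
    coef, *_ = np.linalg.lstsq(B, Y, rcond=None)
    return coef

def predict(x, coef, knots):
    # Classify by the largest estimated conditional class probability
    P = cubic_spline_basis(x, knots) @ coef
    return P.argmax(axis=1)

# Two classes separated by a nonlinear boundary (|x| = 1) in one feature --
# a case where any boundary linear in x must misclassify a large fraction
rng = np.random.default_rng(0)
x = rng.uniform(-2.0, 2.0, 400)
y = (x**2 > 1.0).astype(int)
knots = np.array([-1.0, 0.0, 1.0])
coef = fit_sketch(x, y, 2, knots)
acc = (predict(x, coef, knots) == y).mean()
```

Because the fitted spline can bend at the knots, the induced decision rule recovers both boundary points near x = -1 and x = +1, whereas a single linear boundary in x can separate the classes at only one threshold.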
