Abstract

In a J-class classification problem with data of the form (y_n, x_n), n = 1, …, N, where y_n ∈ {1, …, J} and x_n = (x_{1n}, …, x_{Mn}), linear discriminant analysis produces estimated class boundaries which are linear in x_1, …, x_M. In this paper, a method is developed which estimates the conditional class probabilities in a function space larger than the space of linear functions. The decision rule based on these estimated conditional class probabilities can have highly nonlinear class boundaries. The method projects the conditional class probabilities onto a space spanned by cubic splines, and hence is called classification using splines (CUS). The new method appears to achieve misclassification error rates comparable to, and in some cases lower than, those of existing methods such as CART or the back-propagation neural network classifier.
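The flavor of the approach can be illustrated with a minimal sketch: expand each coordinate in a cubic spline basis, regress the class-indicator (one-hot) responses on that basis by least squares to estimate the conditional class probabilities, and classify by the largest estimated probability. This is only an illustrative analogue of CUS, not the paper's exact procedure; the truncated-power basis, knot placement, and the toy data set below are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: class 1 inside a circle, class 0 outside, so the true
# boundary is nonlinear in (x1, x2) and no linear discriminant works.
N = 400
X = rng.uniform(-1, 1, size=(N, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 0.4).astype(int)

def cubic_spline_basis(x, knots):
    """Truncated-power cubic spline basis for one variable
    (an illustrative choice; the paper's basis may differ)."""
    cols = [x, x ** 2, x ** 3]
    cols += [np.clip(x - k, 0, None) ** 3 for k in knots]
    return np.column_stack(cols)

knots = np.linspace(-0.6, 0.6, 4)

# Additive spline expansion of both coordinates, plus an intercept.
B = np.column_stack(
    [np.ones(N)] + [cubic_spline_basis(X[:, j], knots) for j in range(2)]
)

# One-hot class indicators; a least-squares fit in the spline space
# gives (rough) estimates of the conditional class probabilities.
Y = np.eye(2)[y]
coef, *_ = np.linalg.lstsq(B, Y, rcond=None)
P = B @ coef

# Decision rule: assign each point to the class with the largest
# estimated conditional probability.
pred = P.argmax(axis=1)
acc = (pred == y).mean()
```

Because the circular boundary is additive in x1² and x2², the spline-expanded fit recovers it well, while a model linear in (x1, x2) could do no better than predicting the majority class.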
