Abstract

For problems with a small number of samples and a large number of features, feature selection by cross-validation frequently resorts to random tie-breaking because the recognition rate is a discrete criterion, and this leads to inferior feature selection results. To solve this problem, we propose using a least squares support vector regressor (LS SVR) instead of a least squares support vector machine (LS SVM). We treat the labels (1/-1) as the targets of the LS SVR and use the mean absolute error under cross-validation as the selection criterion. Because the LS SVR output is continuous, the selection and ranking criteria become continuous and tie-breaking becomes rare. For evaluation, we use incremental block addition and block deletion of features, which was developed for function approximation. Computer experiments show that the performance of the proposed method is comparable to that of a criterion based on the weighted sum of the recognition error rate and the average margin error.
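As a rough illustration of the criterion described above (not the paper's own code), the sketch below treats the class labels in {1, -1} as regression targets and scores a candidate feature subset by its cross-validated mean absolute error. It uses scikit-learn's KernelRidge as a stand-in for an LS SVR, since both solve a regularized least-squares problem; the kernel parameters and the helper name `cv_mae_criterion` are assumptions, and the paper's incremental block addition/deletion search is not reproduced here.

```python
# Minimal sketch of the continuous selection criterion: labels in
# {1, -1} are used as regression targets, and the cross-validated
# mean absolute error (MAE) scores a feature subset. KernelRidge
# stands in for an LS SVR here (both solve a regularized
# least-squares problem); kernel parameters are illustrative only.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

def cv_mae_criterion(X, y, feature_subset, n_folds=5):
    """Cross-validated MAE of a least-squares regressor on a feature subset.

    Lower is better. Unlike the discrete recognition rate, this value
    is continuous, so ties between candidate subsets are rare.
    """
    Xs = X[:, feature_subset]
    model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1)
    scores = cross_val_score(model, Xs, y, cv=n_folds,
                             scoring="neg_mean_absolute_error")
    return -scores.mean()

# Usage: compare two candidate subsets on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 100))            # small sample, many features
y = np.sign(X[:, 0] + 0.5 * X[:, 1])      # labels in {1, -1}
print(cv_mae_criterion(X, y, [0, 1]))     # informative features: low MAE
print(cv_mae_criterion(X, y, [50, 51]))   # uninformative features: high MAE
```

Because the MAE varies continuously with the feature subset, two subsets almost never score identically, which is exactly what the abstract argues the discrete recognition rate cannot provide.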
