Abstract

In function approximation, datasets with many redundant input variables cause problems such as deteriorated generalization ability and increased computational cost. One way to address these problems is variable selection. In pattern recognition, backward variable selection by block deletion has been shown to be effective. In this paper, we extend this method to function approximation. To prevent deterioration of the generalization ability, we use the approximation error on a validation set as the selection criterion. To reduce the computational cost, during variable selection we optimize only the margin parameter by cross-validation. If block deletion fails, we backtrack and switch to a binary search for efficient variable selection. Through computer experiments on several datasets, we show that our method achieves performance comparable to that of the conventional method while greatly reducing computational cost. We also show that a set of input variables selected by LS-SVRs can be used for SVRs without deteriorating the generalization ability.
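The selection loop described above (delete a block of the least important variables, accept the deletion only if the validation error does not increase, and halve the block on failure) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the toy `val_error` function stands in for the LS-SVR validation error, and the single-variable ranking proxy is an assumption for illustration.

```python
def block_deletion(variables, val_error, block_frac=0.5):
    """Backward variable selection by block deletion.

    Repeatedly tries to delete a block of the apparently least
    important variables; if the validation error worsens, the block
    size is halved (binary search) instead of accepting the deletion.
    """
    selected = list(variables)
    best_err = val_error(selected)
    block = max(1, int(len(selected) * block_frac))
    while block >= 1 and len(selected) > 1:
        # Cheap ranking proxy: variables whose individual removal
        # increases the validation error least are deleted first.
        ranked = sorted(
            selected,
            key=lambda v: val_error([u for u in selected if u != v]),
        )
        candidate = [v for v in selected if v not in ranked[:block]]
        err = val_error(candidate) if candidate else float("inf")
        if err <= best_err:
            # Block deletion succeeded: accept and reset the block size.
            selected, best_err = candidate, err
            block = max(1, int(len(selected) * block_frac))
        elif block == 1:
            break  # even a single deletion worsens the error: stop
        else:
            block //= 2  # backtrack: binary-search a smaller block


    return selected


# Toy validation error: variables 0 and 2 are relevant (missing one
# costs 1.0); each redundant variable kept adds a small 0.01 penalty.
def toy_err(subset):
    relevant = {0, 2}
    missing = sum(1 for r in relevant if r not in subset)
    redundant = sum(1 for v in subset if v not in relevant)
    return missing * 1.0 + redundant * 0.01
```

With `toy_err`, `block_deletion([0, 1, 2, 3, 4, 5], toy_err)` converges to the relevant subset `[0, 2]`.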
