Abstract

In the era of big data, feature selection is indispensable as a dimensionality reduction technique that lowers data complexity and improves machine learning performance. However, traditional feature selection methods focus mainly on classification performance and ignore the associated feature costs, e.g., the price, risk, or computational expense of acquiring each feature. In this research, we extend the ℓ1-norm support vector machine (ℓ1-SVM) to account for feature costs by incorporating a budget constraint, preserving classification accuracy while selecting the least expensive features. Furthermore, we formulate its robust counterpart to handle uncertainty in the feature costs. To enhance computational efficiency, we also develop an algorithm that tightens the bound on the weight vector in the budget constraint. In experiments on a variety of benchmark and synthetic datasets, our proposed mixed integer linear programming (MILP) models achieve competitive predictive and economic performance, and the bound-tightening algorithm helps curtail computation time.
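The budgeted ℓ1-SVM described in the abstract can be sketched as a MILP: binary variables select features, a big-M constraint links each weight to its selector, and total feature cost is capped by a budget. The sketch below is a minimal illustration under stated assumptions, not the authors' exact formulation; the function name `budgeted_l1_svm`, the parameters `lam` (ℓ1 penalty) and `M` (big-M weight bound), and the solver choice (SciPy's HiGHS-based `milp`) are all assumptions made for illustration.

```python
# Minimal sketch of a budget-constrained l1-SVM as a MILP (an assumption of
# the model's structure, not the paper's exact formulation).
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

def budgeted_l1_svm(X, y, costs, budget, lam=1.0, M=10.0):
    """min  lam*||w||_1 + sum(xi)  subject to
         y_i (w . x_i + b) >= 1 - xi_i,   xi_i >= 0,
         |w_j| <= M * z_j,                 z_j in {0, 1},
         sum_j costs_j * z_j <= budget.
       Decision vector layout: [w+ (d), w- (d), b (1), xi (n), z (d)]."""
    n, d = X.shape
    nv = 3 * d + n + 1
    # Objective: lam on |w| (split as w+ + w-), 1 on each slack, 0 on b and z.
    c = np.concatenate([lam * np.ones(d), lam * np.ones(d), [0.0],
                        np.ones(n), np.zeros(d)])
    # Margin constraints: y_i * ((w+ - w-) . x_i + b) + xi_i >= 1.
    A1 = np.zeros((n, nv))
    A1[:, :d] = y[:, None] * X
    A1[:, d:2 * d] = -y[:, None] * X
    A1[:, 2 * d] = y
    A1[np.arange(n), 2 * d + 1 + np.arange(n)] = 1.0
    # Big-M link: w+_j + w-_j - M * z_j <= 0 (weight is zero unless selected).
    A2 = np.zeros((d, nv))
    A2[:, :d] = np.eye(d)
    A2[:, d:2 * d] = np.eye(d)
    A2[:, 2 * d + 1 + n:] = -M * np.eye(d)
    # Budget constraint on the total cost of selected features.
    A3 = np.zeros((1, nv))
    A3[0, 2 * d + 1 + n:] = costs
    cons = [LinearConstraint(A1, lb=1.0, ub=np.inf),
            LinearConstraint(A2, lb=-np.inf, ub=0.0),
            LinearConstraint(A3, lb=-np.inf, ub=budget)]
    lb = np.zeros(nv); lb[2 * d] = -np.inf          # only b is unrestricted
    ub = np.full(nv, np.inf); ub[2 * d + 1 + n:] = 1.0
    integrality = np.zeros(nv); integrality[2 * d + 1 + n:] = 1  # z is binary
    res = milp(c=c, constraints=cons, bounds=Bounds(lb, ub),
               integrality=integrality)
    w = res.x[:d] - res.x[d:2 * d]
    b = res.x[2 * d]
    z = np.round(res.x[2 * d + 1 + n:]).astype(int)
    return w, b, z
```

The robust counterpart mentioned in the abstract would replace the single budget row with its uncertainty-set equivalent; tightening `M` (the paper's bound-tightening algorithm) shrinks the big-M relaxation and speeds up the branch-and-bound search.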
