Abstract

The recently proposed Relevance Sample-Feature Machine (RSFM) performs joint feature selection and classification with state-of-the-art performance in terms of accuracy and sparsity. However, it suffers from high computational cost on large training sets. To accelerate its training procedure, we introduce a new variant of this algorithm named the Incremental Relevance Sample-Feature Machine (IRSFM). In IRSFM, the marginal likelihood maximization approach is changed so that model learning follows a constructive procedure: starting from an empty model, it iteratively adds or removes basis functions to build the learned model. Our extensive experiments on various data sets, and comparisons with several competing algorithms, demonstrate the effectiveness of the proposed IRSFM in terms of accuracy, sparsity, and run-time. While IRSFM achieves almost the same classification accuracy as RSFM, it yields a sparser learned model in both the sample and feature domains and requires much less training time than RSFM, especially for large data sets.
