Abstract
Multi-label learning deals with data associated with multiple labels simultaneously. Like traditional single-label learning, multi-label learning suffers from the curse of dimensionality. Feature selection is an effective technique for improving learning efficiency on high-dimensional data. Based on the least squares regression model, we incorporate feature manifold learning and sparse regularization into a joint framework for multi-label feature selection. Graph regularization is used to exploit the geometric structure of the feature space, yielding a better regression coefficient matrix that reflects the importance of the different features. In addition, the $$\ell_{2,1}$$-norm is imposed on the sparsity term to guarantee the sparsity of the regression coefficients. Furthermore, we design an iterative updating algorithm with proven convergence to solve the formulated problem. The proposed method is validated on six publicly available data sets from real-world applications, and extensive experimental results demonstrate its superiority over state-of-the-art multi-label feature selection methods.
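The abstract does not give the exact objective, but a common formulation consistent with its description (least squares loss, a feature-graph regularizer, and an $$\ell_{2,1}$$-norm sparsity term) is $$\min_{W}\; \|XW - Y\|_F^2 + \alpha\,\mathrm{tr}(W^{\top} L W) + \beta\,\|W\|_{2,1},$$ where $$X \in \mathbb{R}^{n \times d}$$ is the data matrix, $$Y \in \mathbb{R}^{n \times c}$$ the label matrix, $$L$$ a feature-graph Laplacian, and $$\alpha, \beta$$ trade-off parameters; these symbols are assumed notation, not taken from the paper. The sketch below illustrates one plausible reweighted iterative update for such an objective; the function name and parameters are hypothetical, and this is not the authors' published algorithm.

```python
import numpy as np

def mfs_sketch(X, Y, L, alpha=1.0, beta=1.0, n_iter=50, eps=1e-8):
    """Illustrative sketch (assumed formulation, not the authors' code) of an
    iterative update for
        min_W ||XW - Y||_F^2 + alpha * tr(W^T L W) + beta * ||W||_{2,1},
    where L is a (d x d) feature-graph Laplacian.
    X: (n, d) data matrix, Y: (n, c) binary label matrix."""
    n, d = X.shape
    XtX, XtY = X.T @ X, X.T @ Y
    # Initialize W with the reweighting matrix D set to the identity.
    W = np.linalg.solve(XtX + alpha * L + beta * np.eye(d), XtY)
    for _ in range(n_iter):
        # Standard diagonal reweighting for the l2,1-norm term:
        # D_ii = 1 / (2 * ||w_i||_2), with eps to avoid division by zero.
        row_norms = np.sqrt((W ** 2).sum(axis=1)) + eps
        D = np.diag(1.0 / (2.0 * row_norms))
        # Closed-form update of W with D held fixed.
        W = np.linalg.solve(XtX + alpha * L + beta * D, XtY)
    # Rank features by the l2-norm of the corresponding rows of W.
    scores = np.linalg.norm(W, axis=1)
    return W, np.argsort(-scores)
```

In this kind of scheme, the row norms of the learned coefficient matrix serve as feature importance scores, and the top-ranked features are retained for the downstream multi-label classifier.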