Abstract

Multi-label feature selection is central to pattern recognition and knowledge mining, and its applications have expanded to diverse scenarios. As an effective framework for processing uncertain and ambiguous information, divergence-based fuzzy rough sets (Div-FRSs) have been proposed and applied to feature selection. However, three critical problems must be solved when applying Div-FRSs to multi-label learning. The first is how to effectively handle the noise introduced by features in multi-label data. The second is how to jointly account for the relevance among all labels. The last is how to fully exploit the uncertainty conveyed by upper approximations, which existing research on Div-FRSs has neglected. To address these issues, this study presents a new divergence-based fuzzy neighborhood rough set model (Div-FNRSs) for multi-label learning using self-information. First, a divergence-based fuzzy neighborhood relation and the corresponding fuzzy neighborhood classes are introduced step by step to manage the noise in multi-label data, and a fuzzy decision is introduced to treat all labels as a whole; combining these components yields the new model Div-FNRSs. Then, divergence-based fuzzy neighborhood self-information, which incorporates both upper and lower approximations, is designed to characterize the discriminating ability of features by establishing three levels of uncertainty measures and exploring their granulation properties. Furthermore, a feature significance measure for choosing optimal features is defined, which motivates a heuristic feature-selection algorithm, DivFNSI-FS. Finally, experiments comparing DivFNSI-FS with six state-of-the-art multi-label feature selection approaches on fourteen multi-label datasets validate its effectiveness. The results show that DivFNSI-FS outperforms the existing algorithms on eight commonly used evaluation indexes.
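The abstract does not give the concrete Div-FNRS formulas, so the following Python sketch only illustrates the general idea it describes: scoring a candidate feature subset through fuzzy-rough lower and upper approximations of each label and a self-information-style measure, then selecting features greedily. The fuzzy similarity function, the neighborhood radius delta, and the way the two approximations are combined into a score are simplifying assumptions for illustration, not the authors' definitions.

```python
# Minimal illustrative sketch (NOT the paper's exact Div-FNRS construction):
# the fuzzy similarity, the radius `delta`, and the combination of the two
# approximations into a self-information score are assumptions.
import numpy as np

def fuzzy_neighborhood_relation(X, delta=0.2):
    """Fuzzy neighborhood relation: similarity cut off outside radius delta.
    Assumes features are scaled to [0, 1]."""
    d = np.abs(X[:, None, :] - X[None, :, :]).mean(axis=2)  # mean absolute difference
    R = 1.0 - d
    R[R < 1.0 - delta] = 0.0  # samples outside the neighborhood get similarity 0
    return R

def approximations(R, label):
    """Standard fuzzy-rough lower/upper approximations of one crisp label."""
    D = label.astype(float)
    lower = np.min(np.maximum(1.0 - R, D[None, :]), axis=1)  # Kleene-Dienes implicator
    upper = np.max(np.minimum(R, D[None, :]), axis=1)        # min t-norm
    return lower, upper

def self_information_score(X, Y, delta=0.2, eps=1e-12):
    """Average self-information of the labels given the candidate feature subset X."""
    R = fuzzy_neighborhood_relation(X, delta)
    total = 0.0
    for k in range(Y.shape[1]):
        lower, upper = approximations(R, Y[:, k])
        # accuracy uses BOTH approximations, echoing the abstract's emphasis
        accuracy = (lower.sum() + eps) / (upper.sum() + eps)
        total += -np.log(accuracy)  # smaller score -> stronger discriminating ability
    return total / Y.shape[1]

def forward_select(X, Y, n_features):
    """Greedy forward selection in the spirit of a heuristic algorithm like DivFNSI-FS."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < n_features:
        best = min(remaining,
                   key=lambda j: self_information_score(X[:, selected + [j]], Y))
        selected.append(best)
        remaining.remove(best)
    return selected
```

In this simplified reading, a feature subset that tightens the lower approximations and shrinks the upper approximations of every label reduces the self-information score, which is the kind of three-way uncertainty reduction the abstract attributes to the proposed measure.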
