Abstract

Multi-label classification has attracted increasing attention because it can assign multiple class labels to a single object. This paper proposes a new method that simultaneously addresses two major problems in multi-label classification: (1) the need for sufficient labeled training data and (2) high dimensionality in the feature and label spaces. For the first issue, we extend semi-supervised learning to handle multi-label classification and exploit unlabeled data whose tagged labels have high average confidence as additional training data. For the second issue, we present two alternative dimensionality-reduction approaches based on Singular Value Decomposition (SVD). The first approach, LAbel Space Transformation for CO-training REgressor (LAST-CORE), reduces the complexity of the label space, while the second, Feature and LAbel Space Transformation for CO-training REgressor (FLAST-CORE), compresses both the label and feature spaces. In both approaches, co-training regression is used to predict values in the lower-dimensional space, and the original space is then reconstructed using the orthogonality of SVD with adaptive threshold setting. Additionally, we introduce a parallel computation scheme to speed up the co-training regression. Experiments on three real-world datasets show that our semi-supervised learning methods perform better than a method that uses only the labeled data. Moreover, for dimensionality reduction, LAST-CORE tends to achieve better classification performance, while FLAST-CORE saves computational time.

Keywords: multi-label classification, Singular Value Decomposition (SVD), dimensionality reduction, semi-supervised learning, co-training
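To make the label-space reduction concrete, the sketch below (not the paper's implementation) shows how an SVD-based transformation in the spirit of LAST-CORE can project a binary label matrix into a lower-dimensional space and reconstruct it through the orthogonality of the singular vectors. The matrix shapes, the reduced dimensionality k, and the per-label mean threshold are illustrative assumptions standing in for the paper's adaptive threshold setting.

```python
import numpy as np

# Hypothetical data: X is (n_samples, n_features), Y is a binary (n_samples, n_labels) label matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
Y = (rng.random(size=(100, 15)) < 0.2).astype(float)

k = 5  # reduced label-space dimensionality (assumed hyperparameter)

# SVD of the label matrix: Y ~ U_k @ diag(s_k) @ Vt_k
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
Vk = Vt[:k].T                 # (n_labels, k), orthonormal columns

# Project labels into the reduced space; regressors are trained to map X -> Z.
Z = Y @ Vk                    # (n_samples, k)

# ... train a multi-output regressor (e.g., co-training regressors) to predict Z from X ...
# Here we reuse Z itself to illustrate only the reconstruction step.
Z_hat = Z

# Reconstruct real-valued label scores using the orthogonality of Vk (Vk.T @ Vk = I).
Y_scores = Z_hat @ Vk.T       # (n_samples, n_labels)

# Simple per-label thresholding as a stand-in for the adaptive threshold scheme.
thresholds = Y_scores.mean(axis=0)
Y_pred = (Y_scores >= thresholds).astype(int)
```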

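The co-training regression step can likewise be sketched, under assumptions, as two regressors that repeatedly pseudo-label the unlabeled pool for each other. Agreement between the two predictions is used here as a stand-in for the paper's average-confidence criterion; the KNeighborsRegressor models, pool sizes, and number of rounds are hypothetical.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(1)
X_lab = rng.normal(size=(30, 20))      # labeled examples
z_lab = rng.normal(size=30)            # one reduced-space target dimension
X_unl = rng.normal(size=(200, 20))     # unlabeled pool

# Two regressors with different neighbourhood sizes act as the two co-training learners.
r1 = KNeighborsRegressor(n_neighbors=3)
r2 = KNeighborsRegressor(n_neighbors=5)

for _ in range(5):                     # a few co-training rounds
    r1.fit(X_lab, z_lab)
    r2.fit(X_lab, z_lab)
    p1, p2 = r1.predict(X_unl), r2.predict(X_unl)

    # Confidence proxy: examples on which the two regressors agree most closely.
    conf = -np.abs(p1 - p2)
    top = np.argsort(conf)[-10:]       # 10 most confident unlabeled points

    # Add the pseudo-labeled examples to the training set and shrink the pool.
    X_lab = np.vstack([X_lab, X_unl[top]])
    z_lab = np.concatenate([z_lab, (p1[top] + p2[top]) / 2])
    X_unl = np.delete(X_unl, top, axis=0)
```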