Abstract

Version space, defined as the subset of the hypothesis space consistent with the training samples, is an important concept in supervised learning. It has been successfully applied to evaluating the informativeness of unlabeled samples in traditional single-label active learning. Specifically, the samples on which the version space members are most inconsistent reduce the size of the version space as quickly as possible; these samples are given high priority for annotation by the domain expert, so that the learner can construct a high-performance classifier while labeling as few samples as possible. We point out that the concept of version space has not yet been extended to multi-label settings, which hinders its application in multi-label active learning. This paper attempts to extend version space theory from the single-label scenario to the multi-label scenario: it builds a spatial structure for the multi-label version space, generalizes it from the finite case to the infinite case, puts forward a simplified representation for it, and accordingly proposes a new multi-label active learning algorithm. Moreover, considering the imbalance issue in multi-label data, the algorithm is further improved by allocating a different number of annotations to each label. Experimental comparisons verify the feasibility and effectiveness of the proposed methods.
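
To make the single-label selection principle described above concrete, the following is a minimal sketch (not the authors' algorithm) of version-space-driven sample selection: an ensemble of classifiers fitted on bootstrap resamples of the labeled pool stands in as a proxy for version space members, and the unlabeled samples on which the members disagree most are queried first, since labeling them eliminates the largest share of inconsistent hypotheses. The toy data, the committee size, and the use of logistic regression are illustrative assumptions only.

```python
# Sketch of version-space-inspired active learning via committee disagreement.
# Assumptions (not from the paper): toy Gaussian data, a 10-member bootstrap
# committee of logistic regressions as a stand-in for version space members,
# and vote entropy as the disagreement score.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy labeled pool and a larger unlabeled pool.
X_labeled = rng.normal(size=(20, 5))
y_labeled = (X_labeled[:, 0] + 0.3 * rng.normal(size=20) > 0).astype(int)
X_unlabeled = rng.normal(size=(200, 5))

# Approximate version space members with classifiers trained on bootstrap
# resamples of the labeled data (a query-by-committee style proxy).
committee = []
for _ in range(10):
    idx = rng.integers(0, len(X_labeled), size=len(X_labeled))
    # Resample until both classes are present so the classifier can be fit.
    while len(np.unique(y_labeled[idx])) < 2:
        idx = rng.integers(0, len(X_labeled), size=len(X_labeled))
    clf = LogisticRegression(max_iter=1000).fit(X_labeled[idx], y_labeled[idx])
    committee.append(clf)

# Disagreement score: vote entropy over the committee's hard predictions.
votes = np.stack([clf.predict(X_unlabeled) for clf in committee])  # (members, samples)
p_pos = votes.mean(axis=0)
eps = 1e-12
vote_entropy = -(p_pos * np.log(p_pos + eps) + (1 - p_pos) * np.log(1 - p_pos + eps))

# The most contentious samples are handed to the annotator first.
query_order = np.argsort(-vote_entropy)
print("Top 5 samples to annotate:", query_order[:5])
```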
