Abstract

Because annotating large amounts of data is expensive and time-consuming, the labeled data available to train a deep neural network is usually scarce, which often degrades the performance of deep classification networks. In this paper, we propose a novel active learning method that trains a competitive deep classification network by labeling only a limited number of diverse images. The proposed method improves on most existing active learning methods in two respects. First, active learning is enhanced with a co-auxiliary learning strategy. An auxiliary network provides diverse pseudo-labels for the primary network and is also used to remove redundant samples from the active learning candidate pool whenever the two networks predict the same label for an image. In turn, the primary network provides pseudo-labels that improve the performance of the auxiliary network. Second, we further reduce redundancy within the query samples through multi-level diversity selection. This strategy considers not only feature-level diversity but also class-level diversity based on the predictions of the primary network, and it selects representative and uncertain samples effectively and efficiently from large-scale data. Extensive experiments on four large-scale datasets show that the proposed method outperforms state-of-the-art methods.
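The abstract describes two mechanisms at a high level; the sketch below illustrates one plausible reading of them, not the paper's actual implementation. It assumes the agreement criterion is a simple match of argmax predictions and that multi-level diversity is realized as greedy farthest-point selection over a joint feature/class-probability space; the function names, the normalization, and the uncertainty-based seed are all hypothetical choices for illustration.

```python
import numpy as np

def co_auxiliary_filter(primary_probs, aux_probs):
    """Agreement-based redundancy removal (assumed criterion):
    candidates on which the primary and auxiliary networks predict
    the same class are treated as redundant and dropped from the
    active learning candidate pool."""
    disagree = primary_probs.argmax(axis=1) != aux_probs.argmax(axis=1)
    return np.where(disagree)[0]  # indices kept as query candidates

def multi_level_diversity_select(features, probs, budget):
    """Hypothetical multi-level diversity selection: concatenate
    L2-normalized features with predicted class distributions so that
    both feature-level and class-level diversity shape the batch,
    then pick samples greedily by farthest-point traversal."""
    feats = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-12)
    joint = np.concatenate([feats, probs], axis=1)
    # Seed with the most uncertain sample (lowest max class probability).
    selected = [int(probs.max(axis=1).argmin())]
    dists = np.linalg.norm(joint - joint[selected[0]], axis=1)
    for _ in range(budget - 1):
        nxt = int(dists.argmax())          # farthest from current batch
        selected.append(nxt)
        dists = np.minimum(dists, np.linalg.norm(joint - joint[nxt], axis=1))
    return selected

# Usage on stand-in predictions for a pool of 1000 unlabeled images:
rng = np.random.default_rng(0)
p1 = rng.dirichlet(np.ones(10), size=1000)   # primary network softmax outputs
p2 = rng.dirichlet(np.ones(10), size=1000)   # auxiliary network softmax outputs
f = rng.normal(size=(1000, 128))             # primary network features
candidates = co_auxiliary_filter(p1, p2)
batch = multi_level_diversity_select(f[candidates], p1[candidates], budget=32)
query_indices = candidates[batch]            # samples sent for annotation
```

Coupling the two stages this way keeps the diversity search small: the agreement filter shrinks the pool before the quadratic-in-budget greedy selection runs, which is consistent with the abstract's efficiency claim, though the exact pipeline ordering in the paper may differ.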
