Abstract

Deep learning (DL) models owe much of their success to the availability of labeled training data, yet manually labeling huge amounts of data is time-consuming and expensive. Active Learning (AL) addresses the cost of labeled-data collection by picking the most useful samples from a large pool of unlabeled data. However, current AL techniques still depend on regular human involvement to annotate the most uncertain or informative samples in the pool. We therefore introduce a Deep Learning Based Active Learning (DLBAL) method that incrementally learns from a small number of annotated training samples to build an effective classifier with an optimized feature representation. The proposed DLBAL method improves on contemporary AL approaches in two ways. First, EfficientNet-B0 networks are integrated into the AL loop, so that the incrementally annotated informative samples update both the feature representation and the classifier simultaneously. Second, we provide an effective sample-selection technique that improves classifier performance with fewer manual annotations at a lower cost. In contrast to previous approaches that emphasize only uncertain instances with low prediction confidence, we also draw a significant number of high-confidence instances from the unannotated set for pattern learning. In particular, these instances are randomly picked and continuously assigned labels. Detailed experiments show that the proposed DLBAL system obtains reasonable results on two challenging image classification datasets, the Cross-Age Celebrity Dataset (CACD) for face recognition and Caltech-256.
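
To make the two-sided selection strategy concrete, the sketch below shows one plausible way to split an unlabeled pool into low-confidence samples for human annotation and high-confidence samples for automatic labeling. It is a minimal illustration, not the paper's implementation: the entropy-based uncertainty measure, the fixed confidence threshold, and the function name and parameters (select_samples, k_uncertain, high_conf_threshold) are all assumptions introduced here for clarity.

```python
import numpy as np

def select_samples(probs, k_uncertain, high_conf_threshold):
    """Split unlabeled-pool predictions into (a) the k most uncertain
    samples, to be sent to a human annotator, and (b) high-confidence
    samples, to be assigned labels automatically.

    probs: (N, C) array of softmax outputs from the current classifier
    over the unlabeled pool. (Illustrative interface; the paper's exact
    selection criteria are not specified in the abstract.)
    """
    # Predictive entropy as the uncertainty measure (one common choice).
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)

    # (a) The k_uncertain highest-entropy samples go to manual annotation.
    uncertain_idx = np.argsort(-entropy)[:k_uncertain]

    # (b) Samples whose top class probability clears the threshold are
    # assigned their predicted class as an automatic label.
    max_conf = probs.max(axis=1)
    confident_idx = np.where(max_conf >= high_conf_threshold)[0]
    confident_idx = np.setdiff1d(confident_idx, uncertain_idx)
    auto_labels = probs[confident_idx].argmax(axis=1)

    return uncertain_idx, confident_idx, auto_labels
```

In an iterative loop, both sets would be folded back into training so that the EfficientNet-B0 feature extractor and the classifier are updated together, matching the incremental scheme the abstract describes.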
