Abstract

Active learning (AL) for object detection (OD) aims to reduce labeling costs by selecting, from the unlabeled pool, the samples most valuable for improving the detection network. Because OD is more complex than image classification, selection strategies require more careful design. Previous works have studied aggregating information from multiple outputs (especially location information) and from batches of boxes, both of which improve performance. However, the evaluation metric, mean average precision (mAP), has not been considered seriously, even though improving it is the goal of AL. Moreover, background instances far outnumber those of the object classes (15:1 or more) in each batch of samples, leading to a class imbalance problem. AL strategies for OD that account for mAP and in-batch class imbalance may therefore perform better. In this paper, WBetGS is proposed, which not only aggregates information from multiple outputs and batches of boxes but also targets mAP improvement and addresses the in-batch class imbalance. A weighted algorithm is introduced to promote mAP more effectively. In addition, WBetGS eliminates the impact of class imbalance between the background and object categories by extracting class-balanced information. Moreover, a diversity- and uncertainty-based sampling algorithm is introduced for batch-mode active learning in object detection. Experimental results demonstrate that our method outperforms baseline methods, saving up to 100% of the labeling effort while reaching the same performance in an actual industrial application.
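The abstract does not give the details of the diversity- and uncertainty-based batch sampler, so the following is only a minimal sketch of one common way such a criterion can be combined: score each unlabeled image by predictive entropy (uncertainty) weighted by its distance to the already-selected samples (diversity), and pick greedily. The function names, the entropy-based uncertainty measure, and the max-min greedy combination are illustrative assumptions, not the method described in the paper.

```python
import numpy as np

def entropy(probs):
    # Shannon entropy of each image's averaged class distribution (higher = more uncertain)
    p = np.clip(probs, 1e-12, 1.0)
    return -np.sum(p * np.log(p), axis=-1)

def select_batch(features, class_probs, batch_size):
    """Greedy uncertainty x diversity batch selection (illustrative sketch).

    features:    (N, d) image-level feature vectors from the detector backbone
    class_probs: (N, C) per-image averaged detection class distributions
    Returns a list of `batch_size` distinct indices into the unlabeled pool.
    """
    uncert = entropy(class_probs)
    selected = [int(np.argmax(uncert))]  # seed with the most uncertain sample
    while len(selected) < batch_size:
        # Distance of every candidate to its nearest already-selected sample
        diffs = features[:, None, :] - features[selected][None, :, :]
        dists = np.min(np.linalg.norm(diffs, axis=-1), axis=1)
        score = uncert * dists           # uncertain AND far from the current batch
        score[selected] = -np.inf        # never re-pick a selected sample
        selected.append(int(np.argmax(score)))
    return selected
```

In batch-mode AL this kind of coupling matters because selecting by uncertainty alone tends to pick many near-duplicate images, wasting labeling budget on redundant samples.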
