Abstract
Recent research on object detection has achieved breakthrough performance. However, in challenging conditions such as occlusion and small objects, detectors still produce inaccurate or erroneous predictions. To cope with such conditions, most existing methods propose loss functions that guide the detector by modulating the magnitude of its loss. This modulation, however, depends heavily on the detector's classification score, and deep neural networks are known to be overconfident in their predictions. In this article, to reduce this over-reliance on the detector's own predictions during training, we devise a novel loss function called class uncertainty-aware (CUA) loss. CUA loss considers predictive ambiguity as well as the classification score when modulating the loss. In addition to the classification score, CUA loss increases the loss gradient when the detector outputs an uncertain prediction. Object detectors trained with CUA loss therefore cope effectively with challenging environments where prediction results are uncertain. With comprehensive experiments on three public datasets (i.e., PASCAL VOC, MS COCO, and Berkeley DeepDrive), we verified that CUA loss enhances the accuracy of object detectors and outperforms previous state-of-the-art loss functions.
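The abstract does not give the exact CUA formulation, but the idea it describes (modulating a score-weighted loss further by predictive uncertainty) can be sketched as follows. This is a hypothetical illustration, not the authors' method: it combines a focal-style term based on the true-class score with an entropy-based uncertainty weight, both of which are assumptions for the sake of the example.

```python
import numpy as np

def uncertainty_modulated_loss(probs, target, gamma=2.0):
    """Hypothetical sketch of an uncertainty-aware classification loss.

    probs  : 1-D array of class probabilities (softmax output)
    target : index of the ground-truth class
    gamma  : focal-style focusing parameter (assumed, not from the paper)
    """
    probs = np.clip(probs, 1e-7, 1.0)
    p_t = probs[target]                        # score of the true class
    focal = (1.0 - p_t) ** gamma               # down-weight easy, confident examples
    # Normalized entropy in [0, 1] as a simple proxy for predictive uncertainty
    entropy = -np.sum(probs * np.log(probs)) / np.log(len(probs))
    uncertainty_weight = 1.0 + entropy         # up-weight uncertain predictions
    return uncertainty_weight * focal * (-np.log(p_t))
```

Under this sketch, a confident correct prediction (e.g. `[0.9, 0.05, 0.05]`) yields a much smaller loss than an uncertain one (e.g. `[0.4, 0.3, 0.3]`), so the gradient signal concentrates on ambiguous cases, which is the behavior the abstract attributes to CUA loss.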
Published in: IEEE Transactions on Circuits and Systems for Video Technology