Abstract

Object detectors are central to autonomous driving and are widely used in driver assistance systems. Because object detectors are trained on a finite amount of data from a specific domain, their detection performance degrades when they are applied to samples from other domains at inference time, an effect known as domain gap. The domain gap is a concern for data-driven applications, forcing repeated retraining of networks whenever the applications expand into other domains. With object detectors that have been trained on day images only, a domain gap can be observed in object detection at night. Training object detectors on night images is challenging because of the enormous effort required to generate an adequate amount of diversely labeled data, and existing data sets tend to overfit to specific domain characteristics. For the first time, this work proposes adapting domains by online image-to-image translation to expand an object detector's domain of operation. The domain gap is decreased without additional labeling effort and without retraining the object detector while expanding into the target domain. The approach follows the concept of domain adaptation, shifting the target domain samples into the domain known to the object detector (source domain). First, the UNIT network is trained for domain adaptation and subsequently cast into an online domain adaptation module, which narrows the domain gap. Domain adaptation capabilities are evaluated qualitatively by displaying translated samples and by visualizing the domain shift with the 2D t-SNE algorithm. We quantitatively benchmark the domain adaptation's influence on a state-of-the-art object detector, and on a retrained object detector, in terms of mean average precision, mean recall, and the resulting F1-score. Our approach achieves an F1-score improvement of 5.27 % in object detection at night when applying online domain adaptation. The evaluation is executed on the BDD100K benchmark data set.
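As a minimal sketch of the benchmarking metric mentioned above: the F1-score is the harmonic mean of precision and recall, so a relative F1 improvement can be computed from the detector's precision/recall before and after domain adaptation. The numeric values below are hypothetical placeholders, not results from the paper.

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical example values (NOT the paper's measurements):
# detector evaluated on night images without and with online domain adaptation.
baseline_f1 = f1_score(precision=0.60, recall=0.50)
adapted_f1 = f1_score(precision=0.63, recall=0.53)

# Relative F1 improvement in percent, as reported in the abstract (5.27 % there).
relative_gain = (adapted_f1 - baseline_f1) / baseline_f1 * 100
print(f"baseline F1 = {baseline_f1:.3f}, adapted F1 = {adapted_f1:.3f}, "
      f"gain = {relative_gain:.2f} %")
```

The same computation applies per class or micro-averaged over the data set, depending on how mean average precision and mean recall are aggregated.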

