Abstract

Object detection usually assumes that training and test samples follow the same distribution, an assumption that rarely holds in practice; cross-domain object detection is studied to address this gap. Compared with image classification, cross-domain object detection poses a greater challenge, since it requires both accurate classification and accurate localization of samples in the target domain. The teacher-student framework, in which the student model is supervised by pseudo-labels generated by the teacher model, has produced large accuracy improvements in cross-domain object detection. Feature-level adversarial training is applied to the student model so that features from the source and target domains share a similar distribution. However, the weights, in terms of both direction and gradient, can be decomposed into domain-specific and domain-invariant components, and the goal of domain adaptation is to focus on domain-invariant features while eliminating interference from domain-specific ones. Inspired by this, we propose a teacher-student framework named dual adaptive branch (DAB), which uses domain adversarial learning to bridge the domain distributions. Specifically, we ensure that the student model aligns domain-invariant features and suppresses domain-specific features during this process. We further validate our method across multiple domains. The experimental results demonstrate that the proposed method significantly improves cross-domain object detection performance and achieves competitive results on common benchmarks.
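The teacher-student loop sketched in the abstract has two recurring ingredients: the teacher is typically updated as an exponential moving average (EMA) of the student, and only high-confidence teacher detections are kept as pseudo-labels for the student. The minimal sketch below illustrates those two steps in plain Python; the function names, the 0.999 momentum, and the 0.8 score threshold are illustrative assumptions, not the paper's exact settings.

```python
def ema_update(teacher_weights, student_weights, momentum=0.999):
    """EMA teacher update: teacher <- m * teacher + (1 - m) * student.

    Weights are modeled as flat lists of floats for illustration; in a real
    detector this runs over every parameter tensor of the two models.
    """
    return [momentum * w_t + (1.0 - momentum) * w_s
            for w_t, w_s in zip(teacher_weights, student_weights)]


def filter_pseudo_labels(detections, score_threshold=0.8):
    """Keep only high-confidence teacher detections as pseudo-labels."""
    return [d for d in detections if d["score"] >= score_threshold]


# Toy usage: two scalar "weights" and two teacher detections on a target image.
teacher = [0.0, 1.0]
student = [1.0, 0.0]
teacher = ema_update(teacher, student)  # slowly tracks the student

pseudo = filter_pseudo_labels([
    {"box": (0, 0, 10, 10), "score": 0.95},  # kept
    {"box": (5, 5, 20, 20), "score": 0.40},  # discarded as unreliable
])
```

The slow EMA momentum keeps the teacher stable while the student, trained on these filtered pseudo-labels plus adversarial feature alignment, adapts to the target domain.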
