Abstract

Object detection is one of the main tasks in computer vision and has made great progress in recent years. However, the performance of object detectors drops significantly when there are differences between existing datasets and the application scenario, leading to the so-called domain shift problem. To address this issue, a novel co-teaching based pseudo label refinery framework for cross-domain object detection is developed, in which two models cooperate to select target-domain data for each other. This strategy effectively purifies the predicted pseudo labels and resists noisy labels. Specifically, the framework consists of two encoders (i.e. a structure encoder and a global encoder), two classifiers and one discriminator, where the structure encoder extracts structural features that are not disturbed by colour, and the global encoder extracts the complete discriminative features. Each encoder is followed by a classifier. In training, the structure and global encoders are first trained with labelled source samples so that they acquire an initial recognition ability. Then, the samples assigned pseudo labels by the classifier following the structure encoder are used to fine-tune the global encoder, which has been pre-trained on the labelled source domain, and refined labels are obtained for the target data. With the refined labels, the structure encoder is further optimised on the target domain. During this process, the two classifiers are used in a crossed manner to promote the mutual transfer of the complementary capabilities of the two encoders. Moreover, a novel residual channel attention (RCA) block embedded with salient features is designed to pay more attention to the target regions. Extensive experiments demonstrate that the developed framework can generate clean labels for unlabelled target data and boost the performance of cross-domain object detection. The code is available at http://www.msp-lab.cn:1436/msp/cbplr-master.
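The sketch below illustrates, in simplified form, the cross pseudo-labelling step described in the abstract: the classifier following the structure encoder assigns pseudo labels to target samples, and the confident ones are used to fine-tune the global branch. It is a minimal, classification-style sketch under assumed interfaces; all names (StructureEncoder-style modules, refine_pseudo_labels, conf_thresh) are hypothetical and do not reflect the authors' released implementation.

```python
# Hypothetical sketch of one co-teaching pseudo-label refinement round.
# The encoder/classifier objects are placeholders, not the authors' API.
import torch
import torch.nn.functional as F

def refine_pseudo_labels(structure_encoder, structure_clf,
                         global_encoder, global_clf,
                         target_images, conf_thresh=0.8):
    """Structure branch proposes pseudo labels for unlabelled target data;
    the confident ones supervise fine-tuning of the global branch.
    (A symmetric step in the opposite direction is assumed elsewhere.)"""
    with torch.no_grad():
        # Structure branch predicts class probabilities on target samples.
        logits_s = structure_clf(structure_encoder(target_images))
        probs_s = F.softmax(logits_s, dim=1)
        conf_s, pseudo_s = probs_s.max(dim=1)
        keep = conf_s > conf_thresh  # retain only confident pseudo labels

    # Fine-tune the global branch on samples selected by the other branch.
    logits_g = global_clf(global_encoder(target_images[keep]))
    loss_g = F.cross_entropy(logits_g, pseudo_s[keep])
    return loss_g
```

In practice the returned loss would be back-propagated through the global encoder and classifier only, so that each branch is supervised by labels filtered by its peer rather than by its own predictions.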
