Abstract

Unsupervised domain adaptation (UDA) transfers knowledge from a label-rich source domain to a target domain that contains relevant information but no labels. Most existing UDA algorithms align domain-invariant features during training while ignoring target-specific information when learning those features. To address this issue, we boost unsupervised domain adaptation with a Fourier approach (FUDA). Specifically, FUDA is inspired by the fact that the amplitude of the Fourier spectrum primarily preserves low-level statistics. FUDA therefore augments the source domain with low-level information from the target domain by fusing the amplitudes of the two domains in the Fourier domain. In addition, we propose Fourier transform channel attention, which weights Fourier components to capture feature diversity. Based on Fourier analysis, we further show that conventional attention built upon global average pooling is a special case of the proposed attention. We evaluate our method on four domain adaptation benchmarks, namely Office-31, Office-Home, VisDA-2017, and DomainNet, demonstrating the effectiveness of FUDA.
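The amplitude-fusion idea described above can be sketched as follows. This is a minimal illustration of mixing Fourier amplitudes while keeping the source phase; the function name `fuse_amplitude`, the mixing weight `lam`, and the linear blend are our assumptions for illustration, not the paper's exact recipe.

```python
import numpy as np

def fuse_amplitude(source, target, lam=0.5):
    """Blend the target's Fourier amplitude into the source image while
    keeping the source phase. `lam` controls how much target amplitude
    is mixed in (lam=0 returns the source image unchanged).
    Sketch only; FUDA's exact fusion rule may differ."""
    fft_src = np.fft.fft2(source, axes=(0, 1))
    fft_tgt = np.fft.fft2(target, axes=(0, 1))
    amp_src, pha_src = np.abs(fft_src), np.angle(fft_src)
    amp_tgt = np.abs(fft_tgt)
    # Fuse the low-level statistics carried by the amplitude spectra.
    amp_mix = (1.0 - lam) * amp_src + lam * amp_tgt
    # Recombine with the source phase, which carries the spatial layout.
    fused = amp_mix * np.exp(1j * pha_src)
    return np.real(np.fft.ifft2(fused, axes=(0, 1)))
```

As an aside relevant to the attention claim: for a real-valued feature map `x`, global average pooling equals the zero-frequency DFT coefficient divided by the number of elements (`np.fft.fft2(x)[0, 0].real / x.size == x.mean()`), which is why attention built on global average pooling can be viewed as the lowest-frequency special case of a Fourier-based channel attention.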

