Unsupervised domain adaptation (UDA) transfers knowledge from a label-rich source domain to a related but unlabeled target domain. Most existing UDA algorithms focus on aligning domain-invariant features during training, while target-specific information is ignored when learning those features. To address this issue, we boost unsupervised domain adaptation with a Fourier approach (FUDA). Specifically, FUDA is inspired by the fact that the amplitude of the Fourier spectrum primarily preserves low-level statistics. FUDA therefore augments the source domain with low-level information from the target domain by fusing the amplitudes of the two domains in the Fourier domain. Meanwhile, we propose a Fourier transform channel attention, which weights the Fourier transform to capture feature diversity. On the basis of Fourier analysis, we further show that conventional attention built upon global average pooling is a special case of our proposed attention. We evaluate our method on four domain adaptation benchmarks, namely Office-31, Office-Home, VisDA-2017, and DomainNet, demonstrating the effectiveness of FUDA.
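The amplitude-fusion step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `fuse_amplitude`, the blend weight `lam`, and the low-frequency band ratio `beta` are assumptions chosen for clarity. The idea is to take the 2-D FFT of a source and a target image, blend their amplitude spectra in a centered low-frequency region, and recombine the fused amplitude with the source phase.

```python
import numpy as np

def fuse_amplitude(source, target, lam=0.5, beta=0.1):
    """Blend the low-frequency Fourier amplitude of `target` into `source`.

    Hypothetical sketch: `lam` is the blend weight for the target amplitude,
    `beta` sets the side length of the low-frequency square as a fraction
    of the image size. Arrays are H x W (grayscale) of matching shape.
    """
    fft_src = np.fft.fft2(source, axes=(0, 1))
    fft_tgt = np.fft.fft2(target, axes=(0, 1))
    amp_src, pha_src = np.abs(fft_src), np.angle(fft_src)
    amp_tgt = np.abs(fft_tgt)

    # Shift spectra so the low frequencies sit at the center.
    amp_src = np.fft.fftshift(amp_src, axes=(0, 1))
    amp_tgt = np.fft.fftshift(amp_tgt, axes=(0, 1))

    h, w = source.shape[:2]
    b = int(min(h, w) * beta)          # half-width of the low-frequency band
    ch, cw = h // 2, w // 2

    # Blend amplitudes only inside the centered low-frequency square.
    amp_src[ch - b:ch + b, cw - b:cw + b] = (
        lam * amp_tgt[ch - b:ch + b, cw - b:cw + b]
        + (1.0 - lam) * amp_src[ch - b:ch + b, cw - b:cw + b]
    )

    amp_src = np.fft.ifftshift(amp_src, axes=(0, 1))

    # Recombine the fused amplitude with the original source phase.
    fft_fused = amp_src * np.exp(1j * pha_src)
    return np.real(np.fft.ifft2(fft_fused, axes=(0, 1)))
```

Because only the amplitude is modified while the source phase is kept, the result preserves the source content but borrows low-level statistics (illumination, color tone, texture contrast) from the target. As a side note relevant to the attention claim: the zero-frequency (DC) term of the 2-D DFT divided by `h * w` equals the global average of the input, which is why global average pooling corresponds to a single Fourier coefficient.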