Abstract
Domain generalization (DG) is a challenging task that aims to train a robust model using only labeled source data so that it generalizes well to unseen target data. The domain gap between source and target data can degrade performance. Many methods resort to learning domain-invariant features to overcome this difficulty. However, these methods require sophisticated network designs or training strategies, causing inefficiency and complexity. In this paper, we first analyze and reclassify domain-invariant features into two categories, i.e., implicitly disentangled features and explicitly disentangled ones. Since we aim to design a generic algorithm for DG that alleviates the problems mentioned above, we focus on the explicitly disentangled features due to their simplicity and interpretability. Based on our analysis, we find that the shape features of images are a simple and elegant choice. We extract shape features from two aspects. From the network aspect, we propose Multi-Scale Amplitude Mixing (MSAM) to strengthen shape features at different layers of the network via the Fourier transform. From the input aspect, we propose a new data augmentation method called Random Shape Warping (RSW) that encourages the model to concentrate on the global structures of objects. RSW randomly distorts local parts of the images while keeping the global structures unchanged, which further improves the robustness of the model. Our methods are simple yet efficient and can be conveniently used as plug-and-play modules. They outperform state-of-the-art (SOTA) methods without bells and whistles.
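The abstract does not specify the MSAM and RSW implementations, but the ideas it names are concrete enough to sketch. The first sketch below illustrates Fourier amplitude mixing of the kind the abstract attributes to MSAM: the amplitude spectra of two samples are interpolated while the phase, which largely encodes shape, is preserved. The function name `amplitude_mix` and the mixing coefficient `alpha` are assumptions for illustration, not the authors' API.

```python
import torch

def amplitude_mix(feat_a, feat_b, alpha=0.5):
    """Hypothetical sketch of Fourier amplitude mixing (not the paper's exact MSAM).

    feat_a, feat_b: tensors of shape (B, C, H, W). The amplitude spectra are
    interpolated while the phase of feat_a, which carries shape/structure
    information, is kept, so the result retains feat_a's shape cues.
    """
    fft_a = torch.fft.fft2(feat_a, dim=(-2, -1))
    fft_b = torch.fft.fft2(feat_b, dim=(-2, -1))
    amp_a, pha_a = torch.abs(fft_a), torch.angle(fft_a)
    amp_b = torch.abs(fft_b)
    # Interpolate amplitudes (texture/style); keep the phase (shape) of feat_a.
    amp_mix = (1 - alpha) * amp_a + alpha * amp_b
    mixed = amp_mix * torch.exp(1j * pha_a)
    return torch.fft.ifft2(mixed, dim=(-2, -1)).real
```

The second sketch illustrates the kind of augmentation the abstract describes for RSW: a smooth, coarse random displacement field warps local regions while the global layout stays recognizable. The function name `random_local_warp` and the `grid_size`/`strength` parameters are likewise assumptions; the paper's actual warping procedure may differ.

```python
import torch
import torch.nn.functional as F

def random_local_warp(img, grid_size=4, strength=0.05):
    """Hypothetical sketch of a random local warp that preserves global structure.

    img: tensor of shape (B, C, H, W). A coarse random displacement field is
    upsampled to a smooth dense flow, so distortions stay local and mild.
    """
    B, _, H, W = img.shape
    # Coarse random offsets, upsampled to a smooth dense flow field.
    flow = (torch.rand(B, 2, grid_size, grid_size, device=img.device) - 0.5) * 2 * strength
    flow = F.interpolate(flow, size=(H, W), mode="bilinear", align_corners=True)
    # Identity sampling grid in [-1, 1], ordered (x, y) as grid_sample expects.
    ys = torch.linspace(-1, 1, H, device=img.device)
    xs = torch.linspace(-1, 1, W, device=img.device)
    gy, gx = torch.meshgrid(ys, xs, indexing="ij")
    base = torch.stack((gx, gy), dim=-1).unsqueeze(0).expand(B, -1, -1, -1)
    grid = base + flow.permute(0, 2, 3, 1)
    return F.grid_sample(img, grid, mode="bilinear",
                         padding_mode="reflection", align_corners=True)
```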