Abstract

Neural networks suffer performance degradation when the source and target data are drawn from different distributions, hampering direct deployment of a model to diverse target domains. To this end, domain generalization (DG) aims to generalize a model to an unknown target domain by utilizing multiple source domains. This paper proposes two simple swapping mechanisms, texture and channel swapping (TCX), for DG. Texture swapping augments the source dataset by replacing the textures of an image with textures from other images in the source dataset, alleviating the texture bias problem in convolutional neural networks (CNNs). Channel swapping exchanges channels of the feature vectors fed to the classifier, along with their labels, encouraging the model to utilize more channels when classifying an image. Together, these mechanisms encourage the model to learn fewer domain-specific features and more generalized class-specific features, resulting in better domain generalization performance. We demonstrate the effectiveness of our approach with state-of-the-art results on three domain generalization benchmarks.
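The abstract does not specify the exact channel-swapping rule, so the following is a minimal sketch under one plausible reading: a random subset of channels from one sample's feature vector replaces the corresponding channels of another, and the label is mixed in proportion to the fraction of channels swapped. The function name `channel_swap`, the `swap_ratio` parameter, and the proportional label-mixing rule are all assumptions for illustration, not the paper's stated method.

```python
import numpy as np

def channel_swap(feat_a, feat_b, label_a, label_b, swap_ratio=0.5, rng=None):
    """Hypothetical channel-swapping sketch (not the paper's exact rule).

    A random subset of channels from sample B overwrites the corresponding
    channels of sample A; the one-hot labels are mixed in proportion to the
    fraction of channels swapped (an assumed mixing rule).
    """
    rng = np.random.default_rng() if rng is None else rng
    num_channels = feat_a.shape[0]
    n_swap = int(num_channels * swap_ratio)
    # Pick which channels to take from sample B.
    idx = rng.choice(num_channels, size=n_swap, replace=False)
    mixed = feat_a.copy()
    mixed[idx] = feat_b[idx]
    # Mix labels by the fraction of channels kept from each sample.
    lam = 1.0 - n_swap / num_channels
    mixed_label = lam * label_a + (1.0 - lam) * label_b
    return mixed, mixed_label
```

Training on such mixed feature/label pairs discourages the classifier from relying on a small set of dominant channels, which is the motivation the abstract gives for channel swapping.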
