Abstract

Unsupervised domain adaptive semantic segmentation transfers knowledge learned from a labeled source-domain dataset to guide segmentation of an unlabeled target domain. However, the differing feature distributions of the source and target domains create a large inter-domain gap. We use self-supervised learning to generate pseudo-labels for the target domain, so that target-domain pixels are aligned directly with the source domain through the segmentation loss. We observe that background classes have similar spatial distributions across the source and target domains, whereas instances of the same foreground class can differ substantially in appearance; we therefore align the foreground and background classes separately. Capturing the rich spatial and channel information of the feature maps produced during convolution is essential for fine-grained semantic segmentation. To model both the dependencies between feature-map channels and the spatial position information, we employ a parallel channel and spatial attention module. This module enables the network to select and amplify valuable spatial and channel information from the global context while suppressing useless information. In addition, we introduce focal loss to address class imbalance in the dataset. Experiments show that our method achieves better segmentation performance in unsupervised domain adaptive semantic segmentation.

• We use self-supervised learning to generate pseudo-labels for the target domain.
• We align the foreground and background classes separately.
• We use a parallel attention module to capture spatial and channel information.
• We add focal loss to the overall loss to reduce the impact of class imbalance.
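The abstract does not spell out how the pseudo-labels are produced. A common self-supervised recipe, sketched below in numpy, is to run the source-trained model on a target image and keep only high-confidence predictions, marking the rest with an ignore index so they contribute nothing to the segmentation loss. The threshold value and ignore index here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def generate_pseudo_labels(probs, threshold=0.9, ignore_index=255):
    """Confidence-thresholded pseudo-labels for one target image.

    probs: (K, H, W) softmax output of the source-trained segmentation model.
    Pixels whose top-class probability falls below `threshold` are set to
    `ignore_index` and excluded from the loss. Both values are assumptions.
    """
    conf = probs.max(axis=0)            # (H, W) confidence of the top class
    labels = probs.argmax(axis=0)       # (H, W) hard class predictions
    labels[conf < threshold] = ignore_index
    return labels

# Toy example: 2 classes on a 2x2 image.
probs = np.array([[[0.95, 0.55],
                   [0.20, 0.60]],
                  [[0.05, 0.45],
                   [0.80, 0.40]]])
pseudo = generate_pseudo_labels(probs)
```

Only the pixel predicted with 0.95 confidence survives; the remaining three pixels are ignored, which keeps noisy target supervision out of the alignment step.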
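The abstract names a parallel channel and spatial attention module but gives no architecture details. The numpy sketch below shows one plausible minimal form of the idea: a channel branch gates each channel from a globally pooled descriptor, a spatial branch gates each position from a channel-pooled descriptor, and the two reweighted feature maps are summed. The projection weights `w_ch` and `w_sp` stand in for learned parameters and are assumptions, not the paper's design.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def parallel_attention(feat, w_ch, w_sp):
    """Minimal parallel channel + spatial attention sketch (assumed design).

    feat: (C, H, W) feature map
    w_ch: (C, C) channel-branch projection (placeholder for learned weights)
    w_sp: (H*W,) spatial-branch projection (placeholder for learned weights)
    """
    C, H, W = feat.shape
    # Channel branch: global average pooling -> projection -> per-channel gate.
    ch_gate = sigmoid(w_ch @ feat.mean(axis=(1, 2)))                 # (C,)
    # Spatial branch: channel-wise pooling -> projection -> per-position gate.
    sp_gate = sigmoid(w_sp * feat.mean(axis=0).ravel()).reshape(H, W)
    # Parallel fusion: both gates reweight the input; the results are summed.
    return feat * ch_gate[:, None, None] + feat * sp_gate[None, :, :]

rng = np.random.default_rng(0)
feat = rng.standard_normal((4, 3, 3))
out = parallel_attention(feat,
                         rng.standard_normal((4, 4)),
                         rng.standard_normal(9))
```

Because both gates are in (0, 1), each branch can only attenuate or pass its branch of the signal; the network learns which channels and positions carry useful global information and suppresses the rest.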
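Focal loss down-weights well-classified examples so that training focuses on hard, typically rare classes. Its standard form is FL(p_t) = -alpha * (1 - p_t)^gamma * log(p_t), where p_t is the predicted probability of the true class; the defaults gamma = 2 and alpha = 0.25 below follow common usage and are not stated in this abstract.

```python
import numpy as np

def focal_loss(probs, targets, gamma=2.0, alpha=0.25):
    """Mean focal loss over a batch.

    probs:   (N, K) predicted class probabilities (rows sum to 1)
    targets: (N,)   integer class indices
    The (1 - p_t)^gamma factor shrinks the contribution of easy examples,
    mitigating class imbalance; gamma/alpha defaults are common assumptions.
    """
    pt = probs[np.arange(len(targets)), targets]   # probability of true class
    return float(np.mean(-alpha * (1.0 - pt) ** gamma * np.log(pt + 1e-12)))

# An easy example (p_t = 0.9) is penalized far less than a hard one (p_t = 0.3).
easy = focal_loss(np.array([[0.9, 0.1]]), np.array([0]))
hard = focal_loss(np.array([[0.3, 0.7]]), np.array([0]))
```

With gamma = 0 and alpha = 1 the expression reduces to ordinary cross-entropy, which is why focal loss can be dropped into the overall objective as a direct replacement for the segmentation loss term.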
