Abstract
Target region segmentation of synthetic aperture radar (SAR) images is one of the challenging problems in SAR image interpretation. Existing conventional segmentation methods depend on careful parameter selection for different backgrounds. Compared with such traditional methods, deep-learning-based methods reduce this parameter dependency and achieve more accurate results. However, the scarcity of annotated data limits the application of deep-learning-based methods to SAR chip image segmentation. To address these problems, a refined network structure for SAR vehicle image semantic segmentation, namely, the All-Convolutional Networks (A-ConvNets)-based Mask (ACM) Net, is proposed. The masks in the training dataset are extracted from images reconstructed with the attributed scattering center (ASC) model, which eliminates the manual annotation otherwise required by deep-learning-based segmentation methods. The proposed ACM Net consists of a modified A-ConvNets backbone and two decoupled head branches that produce the target segmentation and the label prediction, respectively. Experiments on the moving and stationary target acquisition and recognition (MSTAR) dataset show that the overall segmentation performance of ACM Net is better than that of both traditional and deep-learning-based segmentation methods, and its classification results outperform other instance and semantic segmentation methods, achieving state-of-the-art recognition accuracy.
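To make the ASC-based mask generation concrete, the sketch below reconstructs a SAR magnitude image from a handful of point scattering centers and thresholds it into a binary target mask. This is a deliberately simplified illustration, not the paper's pipeline: it keeps only the localized (ideal point) terms of the ASC model, images the phase history with a plain 2-D IFFT instead of a proper polar-format algorithm, and uses assumed collection parameters (`fc`, `B`, `phi_span` roughly matching MSTAR X-band values) and a heuristic threshold.

```python
import numpy as np

def reconstruct_from_scatterers(amps, xs, ys, fc=9.6e9, B=0.59e9,
                                phi_span=np.deg2rad(2.8), n=128):
    """Reconstruct a SAR magnitude image from point scattering centers.

    Simplified ASC model: only localized point scatterers, with the
    frequency- and aspect-dependence terms dropped. Collection
    parameters are assumed values, roughly MSTAR-like.
    """
    c = 3e8
    f = np.linspace(fc - B / 2, fc + B / 2, n)           # frequency samples
    phi = np.linspace(-phi_span / 2, phi_span / 2, n)    # aspect samples
    F, P = np.meshgrid(f, phi, indexing="ij")
    E = np.zeros((n, n), dtype=complex)
    for A, x, y in zip(amps, xs, ys):
        # phase history of one ideal point scatterer located at (x, y)
        E += A * np.exp(-1j * 4 * np.pi * F / c
                        * (x * np.cos(P) + y * np.sin(P)))
    # crude image formation: 2-D IFFT of the (f, phi) samples
    return np.abs(np.fft.fftshift(np.fft.ifft2(E)))

# Threshold the reconstructed magnitude image into a binary target
# mask usable as a segmentation label (threshold is heuristic).
img = reconstruct_from_scatterers(amps=[1.0, 0.8, 0.6],
                                  xs=[0.0, 1.5, -1.0],
                                  ys=[0.0, -0.5, 1.2])
mask = (img > 0.2 * img.max()).astype(np.uint8)
```

In practice one would estimate the scatterer parameters from a real chip, reconstruct, and then clean the thresholded mask with morphological operations before using it as a training label.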
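The decoupled two-head layout described in the abstract can likewise be sketched as a small PyTorch module: an A-ConvNets-style all-convolutional backbone whose features feed a segmentation head (upsampling back to a per-pixel target/background mask) and a classification head (predicting the vehicle label). The class name, layer counts, and channel widths here are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class ACMNetSketch(nn.Module):
    """Hypothetical sketch of the ACM Net layout: an all-convolutional
    backbone with two decoupled heads for segmentation and label
    prediction. Layer sizes are illustrative, not the paper's."""

    def __init__(self, num_classes=10):
        super().__init__()
        self.backbone = nn.Sequential(   # A-ConvNets style: no FC layers
            nn.Conv2d(1, 16, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        self.seg_head = nn.Sequential(   # decodes features to a full-size mask
            nn.ConvTranspose2d(64, 32, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.Conv2d(16, 2, 1),         # 2 classes: target / background
        )
        self.cls_head = nn.Sequential(   # predicts the vehicle label
            nn.Conv2d(64, num_classes, 1),  # 1x1 conv keeps it fully convolutional
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )

    def forward(self, x):
        feats = self.backbone(x)
        return self.seg_head(feats), self.cls_head(feats)

# Example: one single-channel 128x128 SAR chip (typical MSTAR chip size).
net = ACMNetSketch()
mask_logits, label_logits = net(torch.randn(1, 1, 128, 128))
```

Decoupling the heads lets the mask and label losses be weighted independently during training, which is one plausible reading of why the abstract separates segmentation from label prediction.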