Abstract

Joint extraction and fusion of spectral and spatial features is currently a popular approach to hyperspectral image (HSI) classification, and it has achieved satisfactory results in several studies. Because objects in an HSI often appear at different scales, multi-scale feature extraction is necessary; however, many spectral-spatial feature fusion methods do not take this into account, so they cannot capture sufficient features when the range of scales is large. The model proposed in this paper, the Multi-Content Merging Network (MCMN), uses a multi-branch fusion structure with multiple dilated convolution kernels to extract multi-scale spatial features. To counter interference from surrounding heterogeneous objects, useful information from different directions is also fused, merging the features of multiple regions. MCMN further introduces a convolutional block attention mechanism that extracts attention features along both the spatial and spectral dimensions, allowing the network to focus on the most informative parts and effectively improving model performance. In addition, because the number of samples per class is often imbalanced, which adversely affects training, we apply the focal loss function to mitigate this negative factor. Experimental results on three data sets show that MCMN outperforms the comparison models, highlighting the effectiveness of the MCMN structure.
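The multi-scale idea behind the multi-branch structure can be illustrated with a minimal sketch. This is not the authors' implementation: it shows, in plain Python and one dimension, how applying the same small kernel with different dilation rates widens the receptive field of each branch without adding parameters. The function names (`dilated_conv1d`, `multi_scale_features`) and the dilation rates (1, 2, 3) are illustrative assumptions, not taken from the paper.

```python
def dilated_conv1d(signal, kernel, dilation):
    """Valid-mode 1-D convolution with a dilated kernel.

    A k-tap kernel with dilation d spans (k - 1) * d + 1 input samples,
    so larger dilations see a wider context with the same k weights.
    """
    k = len(kernel)
    span = (k - 1) * dilation + 1  # receptive field of the dilated kernel
    return [
        sum(kernel[j] * signal[i + j * dilation] for j in range(k))
        for i in range(len(signal) - span + 1)
    ]


def multi_scale_features(signal, kernel, dilations=(1, 2, 3)):
    """One branch per dilation rate; a real network would fuse the branches
    (e.g. by concatenation) instead of returning them separately."""
    return {d: dilated_conv1d(signal, kernel, d) for d in dilations}


# Example: a 3-tap averaging kernel at three dilation rates. The branch with
# dilation 3 covers a 7-sample neighbourhood, the branch with dilation 1 only 3.
features = multi_scale_features([1] * 9, [1, 1, 1])
```

In a 2-D CNN the same effect is obtained by setting the dilation rate of each convolution branch, which is how the multi-branch structure covers objects of widely differing scales.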
