Abstract

Sea ice detection plays an important role in climate protection and strategic deployment. Owing to its capacity for autonomous feature learning, deep learning has gradually been applied to the classification of remote sensing sea ice images. At present, most deep learning models for sea ice classification operate on single-source remote sensing data. Because of the limitations of single-source data and the information lost as deep models extract features layer by layer, such approaches inevitably hit a bottleneck when fine-grained sea ice classification is required. To address these problems, this paper proposes a sea ice image classification method based on the ResNet16-feature pyramid networks-spatial pyramid pooling-gated fusion network (ResFPG) and heterogeneous data fusion. In the feature extraction stage, an improved ResNet16 extracts multi-level sea ice features from synthetic aperture radar (SAR) and optical data, reducing information loss during feature extraction; an improved feature pyramid network (FPN) then mines and fuses low-level spatial information with high-level semantic information, and a spatial pyramid pooling (SPP) network collects and fuses the output features at different scales. In the feature fusion stage, a gated feature-level fusion strategy is designed in which a gated fusion network (GFN) adaptively adjusts the contribution of the two heterogeneous data sources, further improving overall classification accuracy. To verify the effectiveness of the method, we conduct experiments on two sets of heterogeneous sea ice remote sensing data covering the Hudson Bay area. The experimental results show that, compared with other image classification methods, the proposed method fully exploits and integrates the multi-scale, multi-level features in the heterogeneous data, effectively distinguishes the contribution of each source, and achieves better classification accuracy (97.14% and 95.85% on the two datasets).
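
The gated feature-level fusion described above can be illustrated with a minimal PyTorch sketch, assuming the gate is a sigmoid computed from the concatenated SAR and optical branch features; the class name `GatedFusion`, layer sizes, and gate form are illustrative assumptions, not the authors' exact ResFPG implementation.

```python
import torch
import torch.nn as nn

class GatedFusion(nn.Module):
    """Sketch of gated feature-level fusion of two heterogeneous branches.

    A sigmoid gate, computed from the concatenated SAR and optical features,
    adaptively weights the contribution of each branch before classification.
    (Illustrative only; dimensions and gate design are assumptions.)
    """
    def __init__(self, feat_dim: int, num_classes: int):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Linear(2 * feat_dim, feat_dim),
            nn.Sigmoid(),  # per-channel gate in [0, 1]
        )
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, f_sar: torch.Tensor, f_opt: torch.Tensor) -> torch.Tensor:
        g = self.gate(torch.cat([f_sar, f_opt], dim=-1))
        fused = g * f_sar + (1.0 - g) * f_opt  # adaptive weighting of the two sources
        return self.classifier(fused)

# Usage with dummy branch features: batch of 4, 256-dim features, 5 ice classes
model = GatedFusion(feat_dim=256, num_classes=5)
logits = model(torch.randn(4, 256), torch.randn(4, 256))
print(logits.shape)  # torch.Size([4, 5])
```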
