Abstract
Land image recognition, classification, and land environment detection are important research fields in remote sensing. Because land environment recognition and classification tasks are diverse and complex, a single model rarely achieves the best performance across scene classification of multiple remote sensing land images. To identify the best model for a given task, researchers therefore often have to select and experiment with many candidate models, which raises trial-and-error costs, wastes time, and frequently still fails to yield a suitable model quickly. To address the difficulty of selecting among large existing models, this paper proposes a multi-path reconfigurable network structure and takes a multi-path reconfigurable residual network (MR-ResNet) as an example. The reconfigurable neural network splits trained models into modules with different properties, which researchers can selectively choose and reassemble to generate customized models. By further introducing the concept of a multi-path input network, the optimal path is selected by feeding different modules, which shortens model training time and allows researchers to quickly find a network suited to the current application scenario, saving substantial training data, computational resources, and parameter-tuning effort. Experiments were conducted on three public datasets: NWPU-RESISC45, RSSCN7, and SIRI-WHU. The experimental results demonstrate that the proposed model surpasses the classic residual network (ResNet) in terms of both parameter count and performance.
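The selection idea described above can be illustrated with a minimal sketch: treat a reconfigurable model as a pool of pretrained modules, assemble candidate paths from them, and keep the path that scores best on validation data. All module names and scores below are hypothetical placeholders, not values from the paper; in practice each module would be a trained network block (e.g. a residual stage) and the score a measured validation accuracy.

```python
# Toy illustration of multi-path module selection (all names/scores are
# hypothetical). Each "module" is represented only by an assumed
# validation score; a real implementation would run the assembled
# network on validation data instead.

from itertools import product

# Stand-ins for pretrained modules with different properties:
# module name -> assumed validation score of paths using that module.
shallow_blocks = {"resA": 0.71, "resB": 0.74}
deep_blocks = {"resC": 0.88, "resD": 0.85}

MODULE_POOL = {**shallow_blocks, **deep_blocks}

def evaluate_path(path):
    """Score a candidate path (here: a toy average of module scores)."""
    scores = [MODULE_POOL[name] for name in path]
    return sum(scores) / len(scores)

def select_best_path():
    """Enumerate shallow/deep module combinations, return the best path."""
    candidates = list(product(shallow_blocks, deep_blocks))
    return max(candidates, key=evaluate_path)

best = select_best_path()
print(best)  # -> ('resB', 'resC'), the highest-scoring combination
```

The point of the sketch is that path selection reduces to a search over module combinations, so modules trained once can be reused across many candidate models instead of retraining each candidate from scratch.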