Abstract

Effective algorithms for the segmentation of multispectral images are needed to solve practical problems such as territorial management, emergency monitoring, and ecological research of our planet. In recent years, image segmentation using convolutional neural networks has become very popular. A distinguishing feature of this approach is that each pixel is annotated with its assignment to a particular object class, so that the learning process of such networks is fully supervised. This paper proposes a new method for the segmentation of high-spatial-resolution aerospace images based on convolutional neural networks and mask generation. Our model combines a U-Net architecture with MobileNetV2 as the backbone; it is trained on ground-truth data and produces a full prediction mask. An ensemble has been implemented, consisting of separate networks of the same class that jointly refine the segmentation, and semantic features are used to reduce contour errors at the semantic level. Since an independent test revealed shortcomings for certain classes of the Earth's surface, further research will be devoted to building a catalog of training samples to support reliable differentiation of objects. Overall, the obtained accuracy estimates demonstrate the state-of-the-art performance of the developed model, as well as the effectiveness of this combination of network and datasets for the test regions. The results show that the proposed algorithm can effectively improve the overall accuracy of semantic segmentation of high-spatial-resolution remote sensing images while reducing training and segmentation time.
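The abstract does not give implementation details, but the described architecture (a U-Net decoder over a MobileNetV2 encoder that outputs a per-pixel class mask) can be sketched briefly. The following is a minimal, illustrative Keras sketch, not the authors' code: the skip-connection layer names come from the standard MobileNetV2 implementation in TensorFlow, while the input size, class count, and filter sizes are assumptions chosen for the example.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_unet_mobilenetv2(input_shape=(224, 224, 3), num_classes=6):
    """Illustrative U-Net-style decoder on a MobileNetV2 encoder."""
    base = tf.keras.applications.MobileNetV2(
        input_shape=input_shape, include_top=False, weights="imagenet")

    # Encoder feature maps at successively coarser resolutions
    # (standard MobileNetV2 layer names for a 224x224 input).
    skip_names = [
        "block_1_expand_relu",   # 112x112
        "block_3_expand_relu",   # 56x56
        "block_6_expand_relu",   # 28x28
        "block_13_expand_relu",  # 14x14
        "block_16_project",      # 7x7 (bottleneck)
    ]
    skips = [base.get_layer(name).output for name in skip_names]
    encoder = Model(inputs=base.input, outputs=skips)

    inputs = layers.Input(shape=input_shape)
    s1, s2, s3, s4, x = encoder(inputs)

    # Decoder: upsample and concatenate with the matching skip connection.
    for skip, filters in zip([s4, s3, s2, s1], [512, 256, 128, 64]):
        x = layers.Conv2DTranspose(filters, 3, strides=2, padding="same")(x)
        x = layers.Concatenate()([x, skip])
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)

    # Final upsample to input resolution, one channel of scores per class.
    x = layers.Conv2DTranspose(num_classes, 3, strides=2, padding="same")(x)
    outputs = layers.Softmax(axis=-1)(x)  # per-pixel class probabilities
    return Model(inputs, outputs)

model = build_unet_mobilenetv2()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

With integer-labeled ground-truth masks of shape (height, width), this model can be trained end to end to produce the full prediction mask the abstract refers to; the pretrained ImageNet encoder weights are one common way to shorten training time on remote sensing data.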
