Abstract

Semantic segmentation is an important task in the automatic processing of remote sensing imagery. Existing algorithms based on Convolutional Neural Networks (CNNs) have made rapid progress, especially the Fully Convolutional Network (FCN). However, feeding remote sensing images directly into an FCN remains problematic: the segmentation results are not fine enough, and the network lacks guidance from prior knowledge. To obtain more accurate segmentation results, this paper introduces edge information as prior knowledge into the FCN to revise its segmentation results. Specifically, the Edge-FCN network is proposed, which uses edge information detected by the Holistically-Nested Edge Detection (HED) network to correct the FCN segmentation results. Experimental results on the ESAR and GID datasets demonstrate the validity of Edge-FCN.

Highlights

  • This paper explores the important role of edge information in remote sensing image semantic segmentation

  • Edge-FCN, a network that introduces edge information into Fully Convolutional Network (FCN) semantic segmentation, is proposed in this paper

  • Edge-FCN is divided into Cascade-Edge-FCN and Correct-Edge-FCN according to how the edge information is introduced (a sketch of the two fusion styles follows this list)
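The listing below is an illustrative sketch only: the exact Edge-FCN fusion rules are defined in the paper body, while the two functions here merely contrast a "cascade" style (edge map fed in alongside the image) with a "correct" style (edge map applied after segmentation). The tensor shapes and the gating formula are assumptions, not the authors' implementation.

```python
# Illustrative sketch of two ways an HED edge map could be combined with an FCN.
# All shapes and the gating rule are assumptions, not the paper's exact method.
import torch


def cascade_fusion(image: torch.Tensor, edge_map: torch.Tensor) -> torch.Tensor:
    """Cascade style (assumed): feed the edge map to the segmentation network
    as an extra input channel alongside the RGB image."""
    # image: (N, 3, H, W), edge_map: (N, 1, H, W) with values in [0, 1]
    return torch.cat([image, edge_map], dim=1)  # (N, 4, H, W) input to the FCN


def correct_fusion(fcn_probs: torch.Tensor, edge_map: torch.Tensor,
                   strength: float = 1.0) -> torch.Tensor:
    """Correct style (assumed): use the edge map after segmentation to sharpen
    class probabilities near detected boundaries."""
    # fcn_probs: (N, C, H, W) softmax output; sharpen distributions at edge
    # pixels and renormalize so each pixel's probabilities still sum to 1.
    gate = 1.0 + strength * edge_map
    corrected = fcn_probs ** gate
    return corrected / corrected.sum(dim=1, keepdim=True)
```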


Summary

Introduction

Remote sensing images may contain a variety of geomorphological information, such as roads, arable land, and buildings. Classifying this geomorphological information is of great significance for topographic surveys and military analysis. To accomplish this classification task, each pixel in a remote sensing image should be assigned a label associated with a terrain category, which is consistent with image semantic segmentation. Many creative algorithms, such as FCN [4], the DeepLab models [5,6,7], and CRF as RNN [8], have achieved impressive results. The Fully Convolutional Network (FCN) is the first end-to-end network for semantic segmentation, and it creatively introduces deconvolution. CRF as RNN [8] integrates the CRF into the segmentation network to form an end-to-end model.
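To make the FCN idea mentioned above concrete, the following is a minimal toy sketch: a small convolutional backbone produces a coarse per-class score map, and a learned transposed convolution ("deconvolution") upsamples it back to the input resolution. The layer sizes are arbitrary choices for illustration, not the configuration used in the paper.

```python
# Toy FCN-style head: convolutional scoring followed by learned upsampling.
import torch
import torch.nn as nn


class TinyFCN(nn.Module):
    def __init__(self, num_classes: int = 6):
        super().__init__()
        # Toy backbone: downsamples the input by a factor of 8.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # 1x1 convolution produces a coarse per-class score map.
        self.score = nn.Conv2d(128, num_classes, kernel_size=1)
        # Transposed convolution ("deconvolution") upsamples the scores 8x.
        self.upsample = nn.ConvTranspose2d(num_classes, num_classes,
                                           kernel_size=16, stride=8, padding=4)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.upsample(self.score(self.backbone(x)))


# Example: a 3-channel 256x256 image yields per-class scores at full resolution.
logits = TinyFCN()(torch.randn(1, 3, 256, 256))
print(logits.shape)  # torch.Size([1, 6, 256, 256])
```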

