Abstract

High spatial resolution remote sensing image (HSRRSI) data provide rich texture, geometric structure, and spatial distribution information for surface water bodies. This level of detail better represents the internal components of each object category and better captures the relationships between adjacent objects, and recognition methods such as geographic object-based image analysis (GEOBIA) have improved significantly as a result. However, these methods focus mainly on bottom-up classification from visual features to semantic categories and ignore the top-down feedback that can refine recognition results. In recent years, deep learning has been applied in remote sensing because of its powerful feature extraction ability. In particular, integrated convolutional neural network (CNN) frameworks that combine region proposal generation with object detection have greatly improved object detection performance on HSRRSI, offering a new approach to water body recognition from remote sensing data. This study exploits the self-learning ability of deep learning to construct a modified Mask R-CNN architecture that integrates bottom-up and top-down processes for water recognition. Unlike traditional methods, our approach is entirely data-driven, requires no prior knowledge, and can serve as a novel technical procedure for water body recognition in practical engineering applications. Experimental results show that the method produces accurate recognition results for multi-source and multi-temporal water bodies and effectively avoids confusion with shadows and other ground features.
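As a rough illustration of the kind of pipeline the abstract describes, the sketch below adapts a standard COCO-pretrained Mask R-CNN to a single "water" class. This is not the authors' modified architecture; the class count, tile size, and pretrained weights are illustrative assumptions, and the code only shows how an instance segmentation model of this family could be repurposed for water body recognition on HSRRSI tiles.

import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor


def build_water_maskrcnn(num_classes: int = 2):
    """Build a Mask R-CNN with heads sized for background + water (assumed setup)."""
    # Start from a COCO-pretrained Mask R-CNN and replace its prediction heads.
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")

    # Swap the box classification head for the new class count.
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)

    # Swap the mask prediction head as well.
    in_channels = model.roi_heads.mask_predictor.conv5_mask.in_channels
    model.roi_heads.mask_predictor = MaskRCNNPredictor(in_channels, 256, num_classes)
    return model


if __name__ == "__main__":
    model = build_water_maskrcnn().eval()
    # One dummy 3-band tile standing in for an RGB composite of an HSRRSI scene.
    dummy_tile = [torch.rand(3, 512, 512)]
    with torch.no_grad():
        predictions = model(dummy_tile)
    # Each prediction holds per-instance boxes, labels, scores, and soft masks.
    print(predictions[0]["masks"].shape)

In practice such a model would be fine-tuned on labeled water body masks cut from HSRRSI scenes; the per-instance soft masks returned above would then be thresholded to produce the final water extents.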
