Abstract

Extraction of cultivated land information from high-spatial-resolution remote sensing images is becoming an important approach to digitization and informatization in modern agriculture. The continuous development of deep learning has made it possible to extract cultivated land parcel information in an intelligent way. Aiming at the fine extraction of cultivated land parcels over large areas, this article builds a framework of geographical thematic scene division according to the rule of territorial differentiation in geography. A deep learning semantic segmentation network, an improved U-net with depthwise separable convolution (DSCUnet), is proposed to divide the whole image into scenes. An extended multichannel richer convolutional features (RCF) network then delineates the boundaries of cultivated land parcels within the agricultural functional scenes obtained in the previous step. To verify the feasibility and effectiveness of the proposed methods, experiments were implemented on Gaofen-2 images with different spatial resolutions. The results show that the proposed methods outperform other commonly used methods in both dividing agricultural functional scenes and delineating cultivated land parcels. Meanwhile, the extraction results achieve the highest accuracy in both traditional evaluation indices (Precision, Recall, F1, and IoU) and the geometric boundary precision of cultivated land parcels. The methods in this article provide a feasible solution to the fine extraction of cultivated land parcel information over large areas and under complex landscape conditions in practical applications.
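The abstract names depthwise separable convolution as the modification that distinguishes DSCUnet from a standard U-net. A minimal sketch of why this substitution is attractive, counting weights only (no biases): a standard convolution learns one k × k × c_in filter per output channel, whereas a depthwise separable convolution factors this into a per-channel k × k depthwise step followed by a 1 × 1 pointwise step. The function names and the example layer sizes below are illustrative, not taken from the paper.

```python
def standard_conv_params(c_in: int, c_out: int, k: int) -> int:
    """Weight count of a standard k x k convolution:
    one k x k x c_in filter per output channel."""
    return c_out * c_in * k * k

def depthwise_separable_params(c_in: int, c_out: int, k: int) -> int:
    """Weight count of a depthwise separable convolution:
    depthwise step = one k x k filter per input channel,
    pointwise step = a 1 x 1 convolution mixing channels."""
    depthwise = c_in * k * k
    pointwise = c_out * c_in
    return depthwise + pointwise

# Illustrative example: a 3 x 3 layer with 64 input and 128 output channels.
std = standard_conv_params(64, 128, 3)        # 128 * 64 * 9  = 73728
sep = depthwise_separable_params(64, 128, 3)  # 576 + 8192    = 8768
print(std, sep, round(std / sep, 1))          # roughly 8.4x fewer weights
```

The roughly k²-fold reduction in parameters is what makes the separable variant appealing for segmenting large scenes at high resolution.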
