Abstract

Parcels are the basic unit of crop planting and management, so parcel-wise farmland data form the foundation of precision agriculture applications, and extracting parcels from high-resolution remote sensing images is of great importance. Deep learning-based edge detection methods achieve superior performance, but they output edge intensity maps in raster format with pixel values from 0 to 255. Vectorization, which transforms these rasterized data into vectors, is therefore an important post-processing procedure. In this process, segmentation and thinning are the two key steps for deriving a one-pixel-wide binary edge; however, the traditional method suffers from deviation from the actual edge and from unclosed edges. To address these problems, and based on the hypothesis that the larger the edge intensity value, the greater the likelihood that the pixel lies on or near a boundary, we developed a multilevel segmentation method for extracting agricultural parcels from a semantic boundary map. The method prioritizes high-intensity pixels so that the extracted boundaries adhere closely to the actual boundaries, and uses low-intensity pixels to connect unclosed boundaries, thereby improving boundary fidelity and completeness simultaneously. We tested our method on images acquired over Hangzhou Bay and Denmark, and the results demonstrate that it accurately extracts agricultural parcels. Compared with single-threshold segmentation, our method yields higher boundary fidelity and completeness. Compared with the state-of-the-art method, it achieves competitive performance on traditional metrics while outperforming it in edge preservation and in the one-to-one correspondence between extracted and actual parcels.
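
The abstract describes the method only at a high level. As a rough, hypothetical illustration of the multilevel idea (not the authors' exact algorithm), the Python sketch below seeds an edge set with high-intensity pixels and then lets progressively lower-intensity pixels attach only where they touch existing edges, before thinning to a one-pixel-wide edge. The threshold levels and the adjacency-based growing rule are assumptions made for illustration.

```python
import numpy as np
from scipy.ndimage import binary_dilation
from skimage.morphology import skeletonize

def multilevel_edge_segmentation(intensity, thresholds=(200, 150, 100, 50)):
    """Illustrative multilevel segmentation of an 8-bit edge intensity map.

    The edge set is seeded with the most confident (high-intensity) pixels so the
    result stays close to the true boundary; at each lower threshold, weaker pixels
    are attached only where they touch the current edge set, so they extend and
    close gaps rather than introduce isolated responses.
    """
    edges = intensity >= thresholds[0]
    for t in thresholds[1:]:
        candidates = intensity >= t
        # Grow the edge set: repeatedly attach candidate pixels adjacent to it.
        while True:
            added = candidates & binary_dilation(edges) & ~edges
            if not added.any():
                break
            edges |= added
    # Thin the (possibly thick) edge set to a one-pixel-wide binary edge.
    return skeletonize(edges)
```

With a single threshold, the trade-off is direct: a high value misses weak boundary segments and leaves parcels unclosed, while a low value admits noisy pixels and shifts edges away from their true position; growing from strong to weak pixels is one way to get both fidelity and completeness.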

