Abstract
Obtaining accurate cultivated land distribution data is crucial for sustainable agricultural development. Existing cultivated land extraction studies mainly target regularly shaped plots at small block scales. In complexly shaped cultivated land, plot fragmentation leads to variable scales and blurred edges, and the kernel convolution operations of CNN-based models struggle to capture global context. To address these problems, we propose MFEPNet, a complexly shaped farmland extraction network that considers multi-scale features and edge priors. Specifically, we design a context cross-attention fusion module that couples the local and global features extracted by a two-branch CNN-transformer network, yielding more accurate representations of cultivated land plots. A multi-scale feature reconstruction module constructs relation maps and, combined with a gating weight based on information entropy, compensates for missing multi-scale information. Additionally, we design a texture-enhanced edge module that uses an attention mechanism to fuse edge information extracted from texture features with the reconstructed feature map, enhancing edge representations. Together, these components reduce the influence of variable scales, blurred edges, and a limited global field of view. We compare the proposed model with classical deep learning models, including UNet, DeepLabV3+, DANet, PSPNet, RefineNet, SegNet, ACFNet, and OCRNet, on regular and irregular farmland datasets derived from the IFLYTEK and Netherlands datasets. The experimental results show that MFEPNet achieves 92.40% and 91.65% mIoU on the regular and irregular farmland datasets, respectively, outperforming the benchmark models.
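To make the entropy-based gating described above concrete, the following is a minimal illustrative sketch, not the authors' implementation, of fusing two feature scales with weights derived from the Shannon entropy of each map's channel activations, assuming PyTorch; the module name `EntropyGatedFusion` and all internal details are hypothetical.

```python
# Illustrative sketch (not the paper's code): entropy-gated fusion of
# multi-scale feature maps. Lower-entropy (more confident) maps get
# larger gate weights, approximating the abstract's gated compensation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EntropyGatedFusion(nn.Module):
    def __init__(self, in_channels: int):
        super().__init__()
        # 1x1 conv projects the concatenated gated scales back down.
        self.project = nn.Conv2d(in_channels * 2, in_channels, kernel_size=1)

    @staticmethod
    def _entropy(feat: torch.Tensor) -> torch.Tensor:
        # Softmax over channels gives a per-pixel distribution; its
        # Shannon entropy is averaged over the map, yielding shape (B,).
        p = F.softmax(feat, dim=1).clamp_min(1e-8)
        return -(p * p.log()).sum(dim=1).mean(dim=(1, 2))

    def forward(self, low: torch.Tensor, high: torch.Tensor) -> torch.Tensor:
        # Upsample the coarser map to the finer spatial resolution.
        high = F.interpolate(high, size=low.shape[-2:], mode="bilinear",
                             align_corners=False)
        # Negate entropy so that more confident maps receive larger weights.
        e = torch.stack([self._entropy(low), self._entropy(high)], dim=1)
        w = F.softmax(-e, dim=1).view(-1, 2, 1, 1, 1)  # (B, 2, 1, 1, 1)
        fused = torch.cat([w[:, 0] * low, w[:, 1] * high], dim=1)
        return self.project(fused)

# Usage: fuse stride-8 and stride-16 feature maps with 256 channels.
fuse = EntropyGatedFusion(in_channels=256)
out = fuse(torch.randn(2, 256, 64, 64), torch.randn(2, 256, 32, 32))
print(out.shape)  # torch.Size([2, 256, 64, 64])
```

The per-image scalar gate here is one plausible reading of "gated weight parameter based on information entropy"; the paper's module may instead gate per pixel or per channel.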