Abstract
Automatic crop mapping is essential for various agricultural applications. Although fully convolutional networks (FCNs) have shown effectiveness in crop mapping, they rely on labor-intensive pixel-level annotations. Weakly supervised semantic segmentation (WSSS) methods offer a solution by enabling pixel-level segmentation with less costly annotations, such as point, bounding box (bbox), and image-level annotations. However, these weak annotations often lack complete object information, leading to reduced accuracy. Moreover, WSSS methods remain underexplored for medium-resolution satellite imagery, such as Sentinel-2. Therefore, this study proposes SAMWS, a WSSS method based on the Segment Anything Model (SAM) for crop mapping using Sentinel-2 time series images. Leveraging SAM, our method can generate high-quality pseudo labels and accommodate various weak annotations without extensive adjustments. SAMWS comprises three stages: 1) finetuning SAM with adapters; 2) generating pseudo labels using weak annotations and the finetuned SAM; and 3) training a segmentation network on the pseudo labels for crop mapping. Experiments conducted on the PASTIS and Munich datasets demonstrate the superiority of our approach. After SAM was finetuned with adapters, the F1-scores for parcel segmentation using point and bbox prompts increased by 75.0% and 14.4%, reaching 0.880 and 0.931, respectively. Additionally, our method achieves classification results closest to fully supervised learning when using point, bbox, and image-level annotations, with F1-scores of 0.810, 0.817, and 0.779, respectively. Our approach offers valuable insights into leveraging foundation models in the remote sensing domain and holds significant potential for crop monitoring. The relevant code will be made publicly available at https://github.com/Nick0317Sun/SAMWS.
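The second stage of the pipeline above can be illustrated with a minimal sketch: a point annotation inside a parcel is expanded into a dense pseudo-label mask, which a segmentation network can then be trained on. This is not the authors' implementation; here a simple flood fill stands in for SAM's point-prompted mask prediction, and the toy scene, class ids, and function names are illustrative assumptions.

```python
import numpy as np
from collections import deque

def flood_fill_mask(img, seed):
    """Stand-in for a SAM point-prompt mask: grow the connected
    region of pixels sharing the seed pixel's value (4-connectivity)."""
    h, w = img.shape
    target = img[seed]
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    q = deque([seed])
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc] \
                    and img[nr, nc] == target:
                mask[nr, nc] = True
                q.append((nr, nc))
    return mask

def pseudo_labels_from_points(img, point_annotations, ignore_index=255):
    """Stage-2 sketch: expand each (point, class) weak annotation into
    a dense pseudo-label map; unprompted pixels are marked 'ignore' so
    the downstream segmentation loss can skip them."""
    labels = np.full(img.shape, ignore_index, dtype=np.uint8)
    for seed, cls in point_annotations:
        labels[flood_fill_mask(img, seed)] = cls
    return labels

# Toy scene: values 1 and 2 mark two parcels, 0 is background.
scene = np.zeros((8, 8), dtype=np.uint8)
scene[1:4, 1:5] = 1   # parcel A, 12 pixels
scene[5:8, 2:7] = 2   # parcel B, 15 pixels

# One point annotation per parcel (hypothetical crop classes 3 and 4).
labels = pseudo_labels_from_points(scene, [((2, 2), 3), ((6, 4), 4)])
print((labels == 3).sum(), (labels == 4).sum())  # → 12 15
```

In the actual method the mask generator is the adapter-finetuned SAM, which also accepts bbox prompts, and the dense pseudo labels supervise a time-series segmentation network in stage 3.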
Journal: International Journal of Applied Earth Observation and Geoinformation