Abstract

Large-scale mapping of apple orchards through remote sensing is of great significance for apple production management and the sustainable development of the apple industry. The flexibility of unmanned aerial vehicle (UAV) imagery and the wide swath of Sentinel-2 (S2) imagery offer an opportunity to map apple orchards accurately and in a timely manner over large areas. To fully combine the advantages of these data sources and achieve accurate monitoring of apple orchards, this study proposes a semantic segmentation method based on Cycle-Consistent Generative Adversarial Networks (CycleGAN) and transfer learning (Trans_GAN). First, four semantic segmentation models (Fully Convolutional Networks, U-Net, SegNet, and DeepLabv3+) were compared, and the model with the best performance on both S2 and UAV imagery was selected as the apple orchard recognition model. Second, to address the domain differences between S2 and UAV imagery, CycleGAN was introduced to convert UAV images into the style of S2 images (Fake S2, F_S2). Finally, transfer learning was applied, using the F_S2 images to assist the S2 images in extracting large-scale apple plantings. Trans_GAN was tested in Zibo and Yantai. The results showed that SegNet refined the segmentation results and achieved the highest extraction accuracy on both UAV and S2 images. Compared with SegNet applied to S2 imagery without image-to-image translation and transfer learning, the proposed method improved recall by up to 20.93% and F1 by 15.93%. Therefore, the Trans_GAN method opens a new window for large-scale remote sensing mapping of apple orchards.
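The reported gains are in recall and F1, which for a binary orchard/non-orchard map are computed per pixel from the confusion counts. A minimal sketch (the counts below are illustrative only, not taken from the study):

```python
def precision_recall_f1(tp: int, fp: int, fn: int):
    """Standard per-pixel metrics from confusion counts.

    tp: orchard pixels correctly labeled orchard
    fp: non-orchard pixels labeled orchard
    fn: orchard pixels labeled non-orchard
    """
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Illustrative counts only (hypothetical, not from the paper)
p, r, f1 = precision_recall_f1(tp=80, fp=20, fn=20)
print(p, r, f1)  # 0.8 0.8 0.8
```

Because F1 is the harmonic mean of precision and recall, a large recall improvement (here, up to 20.93%) raises F1 only insofar as precision is maintained.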
