Abstract
Vegetable mapping from remote sensing imagery is important for precision agricultural activities such as automated pesticide spraying. Multi-temporal unmanned aerial vehicle (UAV) data combine very high spatial resolution with useful phenological information, showing great potential for accurate vegetable classification, especially over complex and fragmented agricultural landscapes. In this study, an attention-based recurrent convolutional neural network (ARCNN) is proposed for accurate vegetable mapping from multi-temporal UAV red-green-blue (RGB) imagery. The proposed model first uses a multi-scale deformable CNN to learn and extract rich spatial features from the UAV data. The extracted features are then fed into an attention-based recurrent neural network (RNN), which establishes the sequential dependencies among the multi-temporal features. Finally, the aggregated spatial-temporal features are used to predict the vegetable category. Experimental results show that the proposed ARCNN yields a high performance, with an overall accuracy of 92.80%. Compared with mono-temporal classification, incorporating multi-temporal UAV imagery boosts accuracy by 24.49% on average, supporting the hypothesis that the low spectral resolution of RGB imagery can be compensated by multi-temporal observations. In addition, the attention-based RNN outperforms other feature fusion methods such as feature stacking, and the deformable convolution operation yields higher classification accuracy than a standard convolution unit. These results demonstrate that the ARCNN provides an effective way to extract and aggregate discriminative spatial-temporal features for vegetable mapping from multi-temporal UAV RGB imagery.
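The temporal-fusion step described above (attention weights over per-date features, followed by a weighted sum) can be sketched in a few lines of NumPy. This is a minimal illustration under simplifying assumptions, not the paper's implementation: each timestep is reduced to a D-dimensional feature vector rather than a CNN feature map or RNN hidden state, and the score parameters `w` and `v` are random stand-ins for learned weights.

```python
import numpy as np

def attention_fuse(features, w, v):
    """Fuse T per-timestep feature vectors into one via softmax attention.

    features: (T, D) array, one D-dim feature vector per acquisition date.
    w: (D, D) and v: (D,) are stand-ins for learned attention parameters.
    """
    scores = np.tanh(features @ w) @ v      # (T,) unnormalized scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                # softmax over the T timesteps
    return weights @ features               # (D,) attention-weighted sum

rng = np.random.default_rng(0)
T, D = 5, 8                                 # e.g. 5 UAV acquisition dates
feats = rng.standard_normal((T, D))
fused = attention_fuse(feats, rng.standard_normal((D, D)), rng.standard_normal(D))
print(fused.shape)  # (8,) — one fused vector fed to the classifier
```

The softmax ensures the dates' contributions are nonnegative and sum to one, so informative acquisition dates can dominate the fused representation.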
Highlights
Accurate vegetable mapping is of great significance for modern precision agriculture
Previous studies mainly focused on staple crop classification [1]. We are therefore motivated to propose an effective method for vegetable classification based on unmanned aerial vehicle (UAV) observations, which could provide a useful reference for future studies on vegetable mapping
We introduce multi-temporal UAV observations, which capture phenological information during the growing season and thereby increase inter-class separability
Summary
Accurate vegetable mapping is of great significance for modern precision agriculture, so automatic methods for precise vegetable classification are well worth studying. Optical satellite imagery was first used for vegetable and crop mapping. Belgiu et al. used multi-temporal Sentinel-2 imagery for crop mapping based on a time-weighted dynamic time warping method and achieved accuracy comparable to random forest (RF) [3]. As the new-generation sensor of the Landsat satellite series, data acquired by the Operational Land Imager (OLI) on Landsat-8 have been used for vegetable and crop type classification. Asgarian et al. used multi-date OLI imagery for vegetable and crop mapping in central Iran based on decision trees and support vector machines (SVM) and achieved good results [6]
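The dynamic time warping (DTW) approach of Belgiu et al. compares two phenological time series by finding the alignment of timesteps that minimizes total cost. The sketch below is the classic (unweighted) DTW recurrence in plain Python for 1-D sequences, for illustration only; their method additionally applies a time-weighting penalty, which is omitted here.

```python
import math

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences.

    D[i][j] = cost of matching a[i-1] with b[j-1], plus the cheapest of the
    three moves (insertion, deletion, match) that could precede it.
    """
    n, m = len(a), len(b)
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # skip a step in a
                                 D[i][j - 1],      # skip a step in b
                                 D[i - 1][j - 1])  # match both steps
    return D[n][m]

print(dtw_distance([1, 2, 3], [1, 2, 3]))        # 0.0 — identical series
print(dtw_distance([0, 1, 2, 3], [0, 0, 1, 2, 3]))  # 0.0 — shift absorbed by warping
```

Because warping absorbs temporal shifts, two fields with the same crop but slightly offset growth stages still compare as similar, which is exactly why DTW suits multi-date crop mapping.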