Maximizing crop production efficiently and sustainably through plant health monitoring is key to global food security. Remote sensing with unmanned aerial vehicles (UAVs) makes monitoring large areas faster and cheaper; however, the cost of advanced sensors such as hyperspectral, multispectral, and thermal cameras limits their adoption among stakeholders. In this study, we explore vegetation indices (VIs) extracted from aerial RGB images acquired across multiple flights to differentiate the nutritional and water statuses of Hass avocado plantations. We used an image processing workflow consisting of image selection with a convolutional neural network (CNN) model, tree crown segmentation, color correction, and feature extraction to automate the computation of VIs from RGB images. To compare the performance of the VIs in differentiating nutritional and water statuses, we propose a comparison metric called Mean Distance between Vegetation Indices (MDVI), analyze the evolution of the extracted features, and study their relationships with gold-standard Normalized Difference Vegetation Index (NDVI) measurements. Because the features extracted from each group vary from flight to flight due to factors such as seasonal light intensity and the phenological stage of the plants, the proposed metric leverages the differences between the features extracted from each group, thereby reducing these temporal effects. We found that the Modified Green Red Vegetation Index (MGRVI) allows the best differentiation of nutritional and water statuses. Furthermore, the correlation coefficients between this VI in the three statuses and NDVI for the nitrogen group range between 0.63 and 0.85, indicating a strong positive relationship.
The results of this work show that MGRVI has potential as a correlation variable in studies that rely only on RGB sensors to monitor the nutritional and water status of crops.
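As a minimal sketch of the two indices named above (assuming per-pixel band arrays taken from segmented tree crowns; the function names and the divide-by-zero handling are illustrative, not the authors' implementation), MGRVI and NDVI can be computed as:

```python
import numpy as np

def mgrvi(red, green):
    """Modified Green Red Vegetation Index: (G^2 - R^2) / (G^2 + R^2).

    `red` and `green` are arrays of reflectance or digital-number values
    for the pixels of a segmented tree crown.
    """
    red = np.asarray(red, dtype=float)
    green = np.asarray(green, dtype=float)
    num = green ** 2 - red ** 2
    den = green ** 2 + red ** 2
    # Guard against division by zero on all-black pixels.
    return np.divide(num, den, out=np.zeros_like(num), where=den != 0)

def ndvi(red, nir):
    """Normalized Difference Vegetation Index: (NIR - R) / (NIR + R).

    Requires a near-infrared band, which is why NDVI needs a
    multispectral sensor while MGRVI needs only an RGB camera.
    """
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    num = nir - red
    den = nir + red
    return np.divide(num, den, out=np.zeros_like(num), where=den != 0)
```

A crown-level feature would then be, e.g., `mgrvi(red_px, green_px).mean()` over the crown's pixels, and the reported Pearson correlations between MGRVI and NDVI could be reproduced with `np.corrcoef` on the per-tree means.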