Abstract

The nitrogen nutrition index (NNI) has been extensively applied for the diagnosis of crop nitrogen status, providing insights into efficient nitrogen utilization and plant growth. In this study, we utilized a low-altitude unmanned aerial vehicle (UAV) platform, equipped with multispectral (MS), red–green–blue (RGB), and thermal infrared (TIR) cameras, to comprehensively capture wheat spectral information. Analysis of the relationship between NNI and relative yield revealed an initially linear relationship that saturated at high NNI values. To enhance accuracy and minimize complexity, we employed a random forest (RF) – recursive feature elimination (RFE) method to select features as inputs for four machine learning (ML) models: back propagation neural network (BPNN), extreme learning machine (ELM), support vector regression (SVR), and Gaussian process regression (GPR). After feature selection, the prediction accuracies of single-sensor models were ranked as MS > RGB > TIR, with R2 values for the four ML models in the range of 0.54–0.75. Among multi-sensor combinations, GPR with MS + RGB + TIR input features achieved the best results, with R2 = 0.89 and RPD = 2.52. The dataset was then partitioned into six subsets based on location and cultivar variety to evaluate model transferability. The results showed that transferability degraded most under the bivariate condition in which both variety and location differed; combining GPR with transfer component analysis (TCA) improved model transferability by 11% on average. The accuracy and transferability of the NNI estimation models were significantly improved, offering valuable guidance and methodological support for diagnosing the nitrogen nutrient status of wheat.
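The RF-RFE feature selection followed by GPR described above can be sketched roughly as below. This is a minimal illustration using scikit-learn's `RFE` and `GaussianProcessRegressor` on synthetic data; the study's actual wheat features, number of selected features, and kernel/hyperparameter choices are not specified in the abstract and are placeholders here.

```python
# Illustrative RF-RFE + GPR pipeline sketch; all data and parameter
# choices below are synthetic placeholders, not the study's settings.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import RFE
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 10))  # 10 candidate MS/RGB/TIR features (synthetic)
y = 0.8 * X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.1, size=120)  # NNI proxy

# Recursive feature elimination ranked by a random-forest regressor
rfe = RFE(
    estimator=RandomForestRegressor(n_estimators=100, random_state=0),
    n_features_to_select=4,  # placeholder count
)
X_sel = rfe.fit_transform(X, y)

# Gaussian process regression on the retained features
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), random_state=0)
gpr.fit(X_sel, y)
nni_pred = gpr.predict(X_sel)
```

The same selected-feature matrix would feed the other three ML models (BPNN, ELM, SVR) for the comparison reported in the abstract.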
