Maize (Zea mays L.) has been shown to be sensitive to temperature deviations, which influence its yield potential, and the development of new maize hybrids resilient to unfavourable weather is therefore a desirable aim for crop breeders. In this paper, we present the development of a multimodal deep learning model that uses RGB images together with phenotypic and weather data, accounting for temporal effects, to predict the yield potential of maize before or during the anthesis and silking stages. The main objective of this study was to assess whether including historical weather data, maize growth captured through imagery, and important phenotypic traits would improve the predictive power of an established multimodal deep learning model. When trained from scratch, the model accurately identified ~89% of hybrids with high yield potential and demonstrated enhanced explanatory power compared with previously published models. SHapley Additive exPlanations (SHAP) analysis indicated that the most influential features include plant density, hybrid placement in the field, days to anthesis, parental line, temperature, humidity, and solar radiation. Including historical weather data was important for model performance, significantly enhancing the model's predictive and explanatory power. In future work, the model could be extended beyond maize yield prediction by fine-tuning it on data from other crops, offering a potential decision-making tool for breeders to identify high-performing individuals across diverse crop types.
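Although the paper's exact architecture is not reproduced here, the minimal sketch below illustrates the general multimodal fusion pattern the abstract describes: a convolutional branch encodes an RGB plot image, a fully connected branch encodes tabular phenotypic and weather features, and the fused embedding feeds a yield regression head. All layer sizes, the tabular feature count, and the class name MultimodalYieldModel are illustrative assumptions, not details taken from the study.

```python
# Minimal sketch of a multimodal fusion network for yield regression.
# Assumptions: input images are 3x128x128 RGB crops and the tabular vector
# concatenates phenotypic traits with weather summaries (32 features here).
import torch
import torch.nn as nn


class MultimodalYieldModel(nn.Module):
    def __init__(self, n_tabular_features: int = 32):
        super().__init__()
        # Image branch: small CNN mapping an RGB plot image to a 32-d embedding
        self.image_branch = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Tabular branch: phenotypic + weather features to a 32-d embedding
        self.tabular_branch = nn.Sequential(
            nn.Linear(n_tabular_features, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
        )
        # Fusion and regression head predicting a single yield value per plot
        self.head = nn.Sequential(
            nn.Linear(32 + 32, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, image: torch.Tensor, tabular: torch.Tensor) -> torch.Tensor:
        fused = torch.cat(
            [self.image_branch(image), self.tabular_branch(tabular)], dim=1
        )
        return self.head(fused)


# Example forward pass with dummy data (batch of 4 plots)
model = MultimodalYieldModel(n_tabular_features=32)
yield_pred = model(torch.randn(4, 3, 128, 128), torch.randn(4, 32))
print(yield_pred.shape)  # torch.Size([4, 1])
```

A SHAP explainer (for example, shap.GradientExplainer applied to the tabular inputs of a trained model of this kind) could then be used to rank feature contributions, which is the type of analysis the abstract reports for plant density, days to anthesis, and the weather variables.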