Abstract
Maize (Zea mays L.) has been shown to be sensitive to temperature deviations, which influence its yield potential. The development of new maize hybrids resilient to unfavourable weather is a desirable aim for crop breeders. In this paper, we showcase the development of a multimodal deep learning model that uses RGB images, phenotypic data, and weather data under temporal effects to predict the yield potential of maize before or during the anthesis and silking stages. The main objective of this study was to assess whether the inclusion of historical weather data, maize growth captured through imagery, and important phenotypic traits would improve the predictive power of an established multimodal deep learning model. Evaluation of the model when trained from scratch showed its ability to accurately identify ~89% of hybrids with high yield potential and demonstrated enhanced explanatory power compared with previously published models. SHapley Additive exPlanations (SHAP) analysis indicated that the top influential features included plant density, hybrid placement in the field, days to anthesis, parental line, temperature, humidity, and solar radiation. Including historical weather data was important for model performance, significantly enhancing both the predictive and the explanatory power of the model. For future research, the use of the model can move beyond maize yield prediction by fine-tuning it on other crop data, serving as a potential decision-making tool for crop breeders to identify high-performing individuals from diverse crop types.
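The abstract does not specify how the three modalities are combined. As a minimal, purely illustrative sketch (not the paper's architecture), a late-fusion approach might pool the weather sequence over time, concatenate it with an image embedding and a phenotypic trait vector, and apply a prediction head; all dimensions and weights below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature dimensions (illustrative only, not from the paper):
img_emb = rng.normal(size=(4, 128))    # batch of 4 RGB image embeddings (e.g. from a CNN)
pheno = rng.normal(size=(4, 10))       # traits such as plant density, days to anthesis
weather = rng.normal(size=(4, 30, 5))  # 30 days x 5 variables (temperature, humidity, ...)

# Late fusion: summarise the weather sequence over time, then concatenate
# all three modalities into one feature vector per plot.
weather_pooled = weather.mean(axis=1)                             # (4, 5)
fused = np.concatenate([img_emb, pheno, weather_pooled], axis=1)  # (4, 143)

# A single linear head stands in for the (unspecified) prediction layers;
# weights here are random, so the outputs are not meaningful yields.
W = rng.normal(scale=0.01, size=(fused.shape[1], 1))
b = np.zeros(1)
yield_pred = fused @ W + b  # (4, 1) yield estimates, one per plot

print(yield_pred.shape)
```

In a real pipeline the pooled-mean step would typically be replaced by a recurrent or attention-based encoder so that temporal weather effects are learned rather than averaged away.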