Abstract
Machine learning algorithms are able to capture complex, nonlinear, interacting relationships and are increasingly used to predict agricultural yield variability at regional and national scales. Applying explainable artificial intelligence (XAI) methods to such algorithms may enable better scientific understanding of the drivers of yield variability. However, XAI methods may provide misleading results when applied to spatiotemporally correlated datasets. In this study, machine learning models are trained to predict simulated crop yield from climate indices, and the impact of the cross-validation strategy on the interpretation and performance of the resulting models is assessed. Using data from a process-based crop model allows us to then comment on the plausibility of the “explanations” provided by XAI methods. Our results show that the choice of evaluation strategy has an impact on (i) interpretations of the model and (ii) model skill on held-out years and regions, after the evaluation strategy is used for hyperparameter tuning and feature selection. We find that a cross-validation strategy based on clustering in feature space achieves the most plausible interpretations as well as the best model performance on held-out years and regions. Our results provide the first steps toward identifying domain-specific “best practices” for the use of XAI tools on spatiotemporal agricultural or climatic data.

Significance Statement

“Explainable” or “interpretable” machine learning (XAI) methods have been increasingly used in scientific research to study complex relationships between climatic and biogeoscientific variables (such as crop yield). However, these methods can return contradictory, implausible, or ambiguous results. In this study, we train machine learning models to predict maize yield anomalies and vary the model evaluation method used. We find that the evaluation (cross-validation) method used affects both model interpretation results and the skill of the resulting models on held-out years and regions. These results have implications for the methodological design of studies that aim to use XAI tools to identify drivers of, for example, crop yield variability.
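To make the idea of a cross-validation strategy based on clustering in feature space concrete, the following minimal sketch shows one possible realization using scikit-learn: samples are clustered on their climate-index features with KMeans, and the cluster labels are then used as groups in a GroupKFold split, so each fold holds out a coherent region of feature space rather than a random subset. The dataset, model choice, number of clusters, and library calls here are illustrative assumptions, not the implementation used in the study.

```python
# Illustrative sketch (not the paper's implementation): feature-space-clustered
# cross-validation, where KMeans cluster labels define the held-out groups.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(0)

# Hypothetical dataset: rows are region-year samples, columns are climate
# indices (e.g., growing-season temperature and precipitation anomalies);
# y is a synthetic yield anomaly with a nonlinear dependence on the indices.
X = rng.normal(size=(500, 6))
y = X[:, 0] - 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.3, size=500)

# Cluster samples in feature space; the cluster labels become CV groups,
# so similar climate conditions are held out together.
clusters = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(
    model, X, y,
    groups=clusters,
    cv=GroupKFold(n_splits=5),
    scoring="r2",
)
print("R^2 per feature-space-clustered fold:", np.round(scores, 2))
```

Compared with random k-fold or leave-one-year-out splits, this grouping tends to give more pessimistic but more honest skill estimates when the data are spatiotemporally correlated, which is the behavior the study exploits when comparing evaluation strategies.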