Abstract
<p>Conveying uncertainty in model predictions is essential, especially when these predictions are used for decision-making. Models are not only expected to achieve the best possible fit to available calibration data but also to capture future observations within realistic uncertainty intervals. Model calibration using Bayesian inference facilitates the tuning of model parameters based on existing observations, while accounting for uncertainties. The model is tested against observed data through the likelihood function, which defines the probability of the data being generated by the given model and its parameters. Inference of the most plausible parameter values is influenced by the method used to combine likelihood values from different observation data sets. In the classical method of combining likelihood values, referred to here as the <em>AND calibration strategy</em>, it is inherently assumed that the given model is true (error-free), and that observations in different data sets are similarly informative for the inference problem. However, practically every model applied to real-world case studies suffers from model-structural errors that are typically dynamic, i.e., they vary over time. A requirement for the imperfect model to fit all data sets simultaneously will inevitably lead to an underestimation of uncertainty due to a collapse of the resulting posterior parameter distributions. Additionally, biased 'compromise solutions' to the parameter estimation problem result in large prediction errors that impair subsequent conclusions. <br>    <br>We present an alternative <em>AND/OR calibration strategy</em> which provides a formal framework to relax posterior predictive intervals and minimize posterior collapse by incorporating knowledge about similarities and differences between data sets. 
As a case study, we applied this approach to calibrate a plant phenology model (SPASS) to observations of the silage maize crop grown at five sites in southwestern Germany between 2010 and 2016. We compared model predictions of phenology using the classical AND calibration strategy with those from two scenarios (OR and ANDOR) in the AND/OR strategy of combining likelihoods from the different data sets. The OR scenario represents an extreme contrast to the AND strategy, as all data sets are assumed to be distinct and the model is allowed to find individually good fits to each period, adjusting to the individual type and strength of model error. The ANDOR scenario acts as an intermediate solution between the two extremes by accounting for known similarities and differences between data sets, and hence grouping them according to the anticipated type and strength of model error. <br>    <br>We found that the OR scenario led to lower precision but higher accuracy of prediction results as compared to the classical AND calibration. The ANDOR scenario led to higher accuracy as compared to the AND strategy and higher precision as compared to the OR scenario. Our proposed approach has the potential to improve the prediction capability of dynamic models in general, by considering the effect of model error when calibrating to different data sets.</p>
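The three likelihood-combination strategies can be sketched in code. This is an illustrative reading only, not the authors' exact formulation: here AND multiplies likelihoods across data sets (all must fit jointly), OR averages them (a mixture, so a good fit to any one data set suffices), and AND/OR applies OR within user-defined groups of similar data sets and AND across groups. The Gaussian error model and the `groups` argument are assumptions for the sketch.

```python
import numpy as np

def gaussian_loglik(obs, sim, sigma):
    """Log-likelihood of one data set under i.i.d. Gaussian errors (assumed error model)."""
    r = np.asarray(obs) - np.asarray(sim)
    return -0.5 * np.sum((r / sigma) ** 2 + np.log(2.0 * np.pi * sigma ** 2))

def combine_and(logliks):
    """AND: product of likelihoods -> sum of log-likelihoods over all data sets."""
    return float(np.sum(logliks))

def combine_or(logliks):
    """OR: data sets treated as distinct alternatives; likelihoods are averaged.
    Computed as a numerically stable log-mean-exp."""
    ll = np.asarray(logliks, dtype=float)
    m = np.max(ll)
    return float(m + np.log(np.mean(np.exp(ll - m))))

def combine_and_or(logliks, groups):
    """AND/OR: OR within each group of similar data sets, AND across groups.
    `groups` is a list of index lists, e.g. [[0], [1, 2]]."""
    return sum(combine_or([logliks[i] for i in g]) for g in groups)
```

With per-data-set log-likelihoods of, say, `[-10.0, -2.0, -6.0]`, the AND value (-18.0) is the most restrictive, the OR value the most permissive, and a grouped AND/OR value falls in between; this mirrors the precision/accuracy trade-off described in the abstract.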