Abstract

The objective of Bayesian inference is often to infer, from data, a probability measure for a random variable that can be used as input for Monte Carlo simulation. When datasets for Bayesian inference are small, a principal challenge is that, as additional data are collected, the probability measure inferred from Bayesian inference may change significantly. In such cases, expensive Monte Carlo simulations may already have been performed using the original distribution, and repeating the full Monte Carlo analysis with the updated density is infeasible because of the large added computational cost. This work explores four strategies for updating Monte Carlo simulations to accommodate such a change in probability measure. The efficiency of each strategy is compared, with the ultimate aim of achieving the change in distribution with a minimal number of added computational simulations. The results show that, when the change in measure is small, importance sampling reweighting can be very effective. Otherwise, a proposed mixed augmenting-filtering algorithm can robustly and efficiently accommodate a measure change in Monte Carlo simulation. The proposed strategy is then applied to uncertainty quantification of the buckling strength of a simple plate, where ongoing data collection refines the estimate of uncertainty in the yield stress.
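The importance sampling reweighting mentioned above can be illustrated with a minimal sketch. This is not the paper's implementation; the distributions, the response function `g`, and all parameter values below are illustrative assumptions, chosen only to show how existing Monte Carlo samples drawn under an original measure can be reweighted to estimate quantities under an updated measure.

```python
# Minimal sketch (not the paper's implementation): reusing existing Monte Carlo
# samples after the input distribution is updated, via importance sampling
# reweighting. The yield-stress distributions and the response g() are
# placeholder assumptions for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Original probability measure used for the expensive Monte Carlo runs,
# and an updated measure obtained after additional data were collected.
p_old = stats.norm(loc=350.0, scale=25.0)   # hypothetical initial yield-stress model (MPa)
p_new = stats.norm(loc=340.0, scale=15.0)   # hypothetical updated yield-stress model (MPa)

# Expensive model evaluations already performed under the original measure.
x = p_old.rvs(size=5000, random_state=rng)
g = np.sqrt(x)                               # stand-in for the costly response of interest

# Importance weights: ratio of the new density to the old (sampling) density.
w = p_new.pdf(x) / p_old.pdf(x)
w /= w.sum()                                 # self-normalized weights

# Estimate of the mean response under the updated measure, with no new runs.
mean_new = np.sum(w * g)

# Effective sample size indicates how well the old samples cover the new
# measure; a small value signals that reweighting alone is unreliable.
ess = 1.0 / np.sum(w**2)
print(f"reweighted mean: {mean_new:.3f}, effective sample size: {ess:.0f}")
```

When the effective sample size collapses, i.e. the change in measure is large, reweighting alone degrades, which is the regime the abstract addresses with the proposed mixed augmenting-filtering algorithm that adds or removes simulations as needed.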
