Confidence-Based Uncertainty Quantification and Model Validation for Simulations of High-Speed Impact Problems

Abstract: Validation exercises for computational models of materials under impact must contend with sparse experimental data as well as with uncertainties due to microstructural stochasticity and variabilities in the thermomechanical properties of the material. This paper develops statistical methods for determining confidence levels for the verification and validation of computational models subject to aleatoric and epistemic uncertainties and sparse stochastic experimental datasets. To demonstrate the method, the classical problem of Taylor impact of a copper bar is simulated. Ensembles of simulations are performed to cover the range of variabilities in the material properties of copper, specifically the nominal yield strength A, the hardening constant B, and the hardening exponent n in a Johnson–Cook material model. To quantify uncertainties in the simulation models, we construct probability density functions (PDFs) of the ratios of the quantities of interest (QoIs), viz., the final bar diameter Df to the original diameter D0 and the final length Lf to the original length L0. The uncertainties in the experimental data are quantified by constructing target output distributions for these QoIs (Df/D0 and Lf/L0) from the sparse experimental results reported in the literature. The simulation output and the experimental output distributions are compared to compute two metrics, viz., the median of the model prediction error and the model confidence at a user-specified error level. It is shown that the median error is lower and the model confidence is higher for Lf/L0 than for Df/D0, implying that the simulation models predict the final length of the bar more accurately than the diameter. The calculated confidence levels are shown to be consistent with expectations from the physics of the impact problem and the assumptions in the computational model.
Thus, this paper develops and demonstrates physically meaningful metrics for validating simulation models using limited stochastic experimental datasets. The tools and techniques developed in this work can be used for validating a wide range of computational models operating under input uncertainties and sparse experimental datasets.
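As a concrete illustration of the two validation metrics described above, the following sketch estimates the median prediction error and the model confidence from samples of a QoI ratio such as Lf/L0. The random pairing of simulated and experimental samples and the relative-error definition are assumptions made for illustration, not the paper's exact formulation.

```python
import random
import statistics

def confidence_metrics(sim_samples, exp_samples, error_level,
                       n_pairs=100_000, seed=0):
    """Estimate the median model prediction error and the model
    confidence P(error <= error_level) for a QoI ratio (e.g. Lf/L0).

    ASSUMPTION: the error is taken as the relative difference between
    randomly paired simulated and experimental samples; the paper's
    exact error definition may differ.
    """
    rng = random.Random(seed)
    errors = []
    for _ in range(n_pairs):
        s = rng.choice(sim_samples)   # draw from simulation output distribution
        e = rng.choice(exp_samples)   # draw from target (experimental) distribution
        errors.append(abs(s - e) / abs(e))
    median_error = statistics.median(errors)
    confidence = sum(err <= error_level for err in errors) / len(errors)
    return median_error, confidence
```

With simulated and experimental QoI samples tightly clustered, the confidence at a modest error level approaches one, matching the intuition that a low median error accompanies a high model confidence.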

Iterative Most Probable Point Search Method for Problems With a Mixture of Random and Interval Variables

Abstract: To represent input variability accurately, an input distribution model for random variables should be constructed from a large amount of data. However, for certain input variables, engineers may have only intervals, which represent input uncertainty. In practical engineering applications, both random and interval variables can exist at the same time. To consider both input variability and uncertainty, inverse reliability analysis should be carried out considering both random and interval variables (mixed variables) and their mathematical correlation in a performance measure. In this paper, an iterative most probable point (MPP) search method is developed for the mixed-variable problem. The update procedures for the MPP search are developed considering the features of mixed variables in the inverse reliability analysis. The MPP searches for the random and interval variables proceed simultaneously to account for the mathematical correlation. An interpolation method is introduced to find a better candidate MPP without additional function evaluations. Mixed-variable design optimization (MVDO) is formulated to obtain a cost-effective and reliable design in the presence of mixed variables. In addition, the design sensitivity of a probabilistic constraint is developed for an effective and efficient MVDO procedure. Using numerical examples, it is found that the developed MPP search method finds an accurate MPP more efficiently than a generic optimization method does. In addition, it is verified that the developed method enables the MVDO process with a small number of function evaluations.
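The iterative MPP update can be sketched for the random-variable part alone. The AMV-style iteration below searches for the inverse-reliability MPP, i.e., the point on the sphere ||u|| = β in standard normal space where the performance function is minimized. This is a minimal stand-in: the paper's method additionally updates interval variables in the same loop and adds an interpolation step, both omitted here.

```python
import math

def inverse_mpp_search(g, n_dim, beta, tol=1e-8, max_iter=100, h=1e-6):
    """AMV-style inverse-reliability MPP search: iterate toward the
    point on the sphere ||u|| = beta (standard normal space) where
    the performance function g is minimized.

    ASSUMPTION: random variables only; the paper's method also
    handles interval variables and their correlation with the
    random variables, which this sketch does not.
    """
    u = [0.0] * n_dim
    for _ in range(max_iter):
        # central-difference gradient of g at the current iterate
        grad = []
        for i in range(n_dim):
            up, um = u[:], u[:]
            up[i] += h
            um[i] -= h
            grad.append((g(up) - g(um)) / (2.0 * h))
        norm = math.sqrt(sum(c * c for c in grad))
        # step to the steepest-descent direction scaled to radius beta
        u_new = [-beta * c / norm for c in grad]
        if max(abs(a - b) for a, b in zip(u_new, u)) < tol:
            return u_new
        u = u_new
    return u
```

For a linear performance function the iteration converges immediately, which is why such fixed-point MPP searches are typically cheaper than running a generic constrained optimizer.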

Effects of parametric uncertainty on multi-scale model predictions of shock response of a pressed energetic material

Predictive simulations of shock-to-detonation transitions (SDTs) of energetic materials must contend with uncertainties in the material properties, reactive models, and microstructures of the material. In this work, we analyze the effects of uncertainties in the run-to-detonation distance h of a pressed energetic (HMX) material due to variabilities in the thermomechanical properties of HMX. The run distances are computed using a recently developed machine-learning-based multiscale modeling framework, viz., the Meso-informed Ignition and Growth (MES-IG) model. The input uncertainties are first used in the MES-IG model to quantify the variabilities in the hotspot dynamics at the mesoscale. A kriging-based Monte Carlo method is used to construct probability density functions (pdfs) for the mesoscale reaction-product formation rates; these are used to propagate the mesoscale uncertainties to the macroscale reaction-progress variables and to construct pdfs for the run-to-detonation distance h. We evaluate uncertainties in h due to variabilities in six material properties, viz., the specific heat, Grüneisen parameter, bulk modulus, yield strength, thermal expansion coefficient, and thermal conductivity of the material. Among these six properties, h is found to be most sensitive to variabilities in the specific heat of the material; the uncertainties in the specific heat amplify exponentially across scales and result in logarithmic pdfs for h. Thus, the paper not only quantifies and propagates uncertainties in material properties across scales in a multiscale model of SDT, but also ranks the properties with respect to the sensitivity of the SDT response of heterogeneous energetic materials to each property.
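The reported exponential amplification of specific-heat uncertainty across scales can be illustrated with a toy propagation: when a macroscale response depends exponentially on a normally distributed property, the output pdf becomes strongly right-skewed (log-normal). The exponential map and all parameter values below are assumed stand-ins for the MES-IG scale bridging, not the model itself.

```python
import math
import random
import statistics

def propagate(mean, cov, amplification, n=50_000, seed=1):
    """Toy cross-scale propagation: sample a normally distributed
    material property (e.g. specific heat, with coefficient of
    variation `cov`) and push it through an assumed exponential
    amplification to a macroscale response.

    ASSUMPTION: the exponential map is an illustrative stand-in
    for the actual multiscale model; the resulting samples are
    log-normal, i.e. strongly right-skewed.
    """
    rng = random.Random(seed)
    return [math.exp(amplification * rng.gauss(mean, cov * mean))
            for _ in range(n)]

h_samples = propagate(mean=1.0, cov=0.05, amplification=3.0)
# Right skew: the sample mean exceeds the sample median.
skew_gap = statistics.mean(h_samples) - statistics.median(h_samples)
```

The positive mean-median gap is the signature of the skewed output pdf: a symmetric input uncertainty, once amplified exponentially, no longer produces a symmetric output.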

Treating Epistemic Uncertainty Using Bootstrapping Selection of Input Distribution Model for Confidence-Based Reliability Assessment

Accurately predicting the reliability of a physical system under aleatory uncertainty requires a very large number of physical output tests. Alternatively, a simulation-based method can be used, but it involves epistemic uncertainties due to imperfections in input distribution models, simulation models, and surrogate models, as well as a limited number of output tests due to cost. Thus, the estimated output distributions and their corresponding reliabilities become uncertain. One way to treat epistemic uncertainty is to use a hierarchical Bayesian approach; however, this can result in an overly conservative reliability estimate because it integrates over possible candidates of the input distribution. In this paper, a new confidence-based reliability assessment method that reduces unnecessary conservativeness is developed. The epistemic uncertainty induced by a limited number of input data is treated by approximating the input distribution model using a bootstrap method. Two engineering examples and one mathematical example are used to demonstrate that the proposed method (1) provides a less conservative reliability estimate than the hierarchical Bayesian analysis, yet (2) predicts the reliability of a physical system that satisfies the user-specified target confidence level, and (3) shows convergent behavior of the reliability estimate as the numbers of input and output test data increase.
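The bootstrap treatment of input-data uncertainty can be sketched as follows: each resample of the limited input data yields one candidate input model, each candidate yields a Monte Carlo reliability estimate, and the conservative reliability is read off at the target confidence level. The normal input model, the scalar limit state, and the dataset in the usage lines are illustrative assumptions, not the paper's examples.

```python
import random
import statistics

def bootstrap_reliability(input_data, g, threshold, target_conf=0.9,
                          n_boot=200, n_mc=2000, seed=2):
    """Confidence-based reliability via bootstrapping the input data:
    each bootstrap resample gives one candidate input model, each
    candidate gives a Monte Carlo estimate of P(g(x) < threshold),
    and the conservative reliability is the (1 - target_conf)
    quantile of those estimates.

    ASSUMPTION: a normal input model and a scalar limit state are
    used purely for illustration.
    """
    rng = random.Random(seed)
    rels = []
    for _ in range(n_boot):
        # candidate input model from one bootstrap resample
        resample = [rng.choice(input_data) for _ in input_data]
        mu = statistics.fmean(resample)
        sd = statistics.stdev(resample)
        # Monte Carlo reliability under this candidate model
        safe = sum(g(rng.gauss(mu, sd)) < threshold for _ in range(n_mc))
        rels.append(safe / n_mc)
    rels.sort()
    return rels[int((1.0 - target_conf) * n_boot)]

# Hypothetical limited input dataset (10 points):
data = [-1.0, -0.8, -0.5, -0.2, 0.0, 0.2, 0.4, 0.5, 0.8, 1.0]
rel_90 = bootstrap_reliability(data, g=lambda x: x, threshold=3.0)
```

Taking a low quantile of the bootstrap reliability estimates, rather than integrating over all candidate input distributions, is what keeps the assessment conservative without being overly so.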

Framework of Reliability-Based Stochastic Mobility Map for Next Generation NATO Reference Mobility Model

A framework for the generation of reliability-based stochastic off-road mobility maps is developed to support the next-generation NATO Reference Mobility Model (NG-NRMM) using full stochastic knowledge of terrain properties and modern complex terramechanics modeling and simulation capabilities. The framework carries out uncertainty quantification (UQ) and reliability assessment for Speed Made Good and GO/NOGO decisions for the ground vehicle based on input variability models of the terrain elevation and soil property parameters. To generate the distribution of the slope at a given point, realizations of the elevation raster are generated using the normal distribution. For the soil property parameters, such as cohesion, friction, and bulk density, the minimum and maximum values obtained from geotechnical databases for each soil type are used to generate a normal distribution with a 99% confidence value range. In the framework, the ranges of terramechanics input parameters that cover the regions of interest are first identified. Within these ranges of input parameters, a dynamic kriging (DKG) surrogate model is obtained for the maximum speed of the Nevada Automotive Test Center (NATC) wheeled vehicle platform complex terramechanics model. Finally, inverse reliability analysis using Monte Carlo simulation is carried out to generate the reliability-based stochastic mobility maps for Speed Made Good and GO/NOGO decisions. It is found that the deterministic map of the region of interest has a probability of only 25% of achieving the indicated speed.
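The step of converting a [min, max] soil-property range into a normal distribution with a 99% confidence value range can be sketched directly: center the normal on the midpoint and choose the standard deviation so the central 99% mass spans exactly the given range. The cohesion range in the usage line is a hypothetical value, not one taken from the geotechnical databases used in the paper.

```python
from statistics import NormalDist

def normal_from_range(lo, hi, coverage=0.99):
    """Build a normal distribution whose central `coverage`
    probability mass spans exactly [lo, hi], mirroring the
    min/max-to-normal step described for the soil parameters.
    """
    z = NormalDist().inv_cdf(0.5 + coverage / 2.0)  # ~2.576 for 99%
    mean = 0.5 * (lo + hi)
    std = (hi - lo) / (2.0 * z)
    return NormalDist(mean, std)

# Hypothetical cohesion range (kPa), for illustration only:
cohesion = normal_from_range(5.0, 15.0)
```

By construction, `cohesion.cdf(15.0) - cohesion.cdf(5.0)` equals the requested 99% coverage, so samples drawn from this model rarely fall outside the database range.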

Treating Epistemic Uncertainty Using Bootstrapping Selection of Input Distribution Model for Confidence-Based Reliability Assessment

To accurately predict the reliability of a physical system under aleatory (i.e., irreducible) uncertainty in system performance, a very large number of physical output test data is required. Alternatively, a simulation-based method can be used to assess reliability, but this remains a challenge because it involves epistemic (i.e., reducible) uncertainties due to imperfections in input distribution models, simulation models, and surrogate models. In practical engineering applications, only a limited number of tests is used to model the input distribution. Thus, the estimated input distribution models are uncertain. As a result, the estimated output distributions, which are the outcomes of the input distributions and biased simulation models, and the corresponding reliabilities also become uncertain. Furthermore, only a limited number of output tests is performed due to cost, which results in additional epistemic uncertainty. To deal with epistemic uncertainties in the prediction of reliability, a confidence concept is introduced to properly assess conservative reliability by considering all epistemic uncertainties due to limited numbers of both input test data (i.e., input uncertainty) and output test data (i.e., output uncertainty), biased simulation models, and surrogate models. One way to treat epistemic uncertainties due to limited numbers of both input and output test data and biased models is to use a hierarchical Bayesian approach. However, the hierarchical Bayesian approach could result in an overly conservative reliability assessment by integrating over possible candidates of the input distribution in a Bayesian analysis. To tackle this issue, a new confidence-based reliability assessment method that reduces unnecessary conservativeness is developed in this paper. In the developed method, the epistemic uncertainty induced by a limited number of input data is treated by approximating the input distribution model using a bootstrap method.
Two engineering examples are used to demonstrate that (1) the proposed method can predict the reliability of a physical system that satisfies the user-specified target confidence level and (2) the proposed confidence-based reliability is less conservative than the one that fully integrates possible candidates of input distribution models in the hierarchical Bayesian analysis.

Evaluation of multifidelity surrogate modeling techniques to construct closure laws for drag in shock–particle interactions

Meta-models (or surrogate models) constructed from mesoscale simulations can be used in place of empirical correlations to close macroscale equations. In shocked particulate flows, surrogate models for drag are constructed as functions of the shock Mach number (Ma), particle volume fraction (ϕ), Reynolds number (Re), etc. The computational cost of high-fidelity mesoscale simulations is a challenge in the construction of surrogates in such hierarchical multiscale frameworks. Here, multifidelity surrogate-modeling techniques are evaluated as inexpensive alternatives to high-fidelity surrogate models for obtaining closure laws for drag in shock–particle interactions. Preliminary surrogates for drag as a function of Ma and ϕ are constructed from ensembles of low-fidelity (coarse-grid) mesoscale computations. The low-fidelity surrogates are subsequently corrected using only a few (Nhf) high-fidelity computations to obtain multifidelity surrogate models. The paper evaluates three different methods for correcting an initial low-fidelity surrogate: Space Mapping (SM), Radial Basis Functions (RBF), and Modified Bayesian Kriging (MBKG). Of these methods, MBKG is found to provide the best multifidelity surrogate model, simultaneously minimizing the computational cost and the error in the constructed surrogate.
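The additive-correction idea behind such multifidelity surrogates can be sketched as follows: the discrepancy between the few high-fidelity samples and the low-fidelity surrogate is interpolated and added back to the cheap model. Inverse-distance weighting over a 1-D input is used below as a simple stand-in for the SM, RBF, and MBKG corrections compared in the paper.

```python
def additive_correction(lf_model, hf_points):
    """Multifidelity surrogate via additive correction: interpolate
    the discrepancy delta(x) = y_hf - lf_model(x) observed at the few
    high-fidelity samples and add it to the low-fidelity surrogate.

    ASSUMPTION: inverse-distance weighting over 1-D inputs is a
    simple stand-in for the SM / RBF / MBKG corrections evaluated
    in the paper.
    """
    deltas = [(x, y - lf_model(x)) for x, y in hf_points]

    def mf_model(x):
        num = den = 0.0
        for xi, di in deltas:
            d = abs(x - xi)
            if d < 1e-12:              # exactly at a high-fidelity sample
                return lf_model(x) + di
            w = 1.0 / (d * d)          # inverse-distance weight
            num += w * di
            den += w
        return lf_model(x) + num / den
    return mf_model

# Toy check: the low-fidelity model misses a constant offset of 0.5.
mf = additive_correction(lambda x: x, [(0.0, 0.5), (1.0, 1.5)])
```

The corrected model reproduces the high-fidelity data exactly at the sampled points while inheriting the low-fidelity trend elsewhere, which is the property that lets a handful of expensive runs upgrade a cheap surrogate.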

Confidence-based reliability assessment considering limited numbers of both input and output test data

Simulation-based methods can be used for accurate uncertainty quantification and prediction of the reliability of a physical system under the following assumptions: (1) accurate input distribution models and (2) accurate simulation models (including accurate surrogate models, if utilized). However, in practical engineering applications, often only limited numbers of input test data are available for modeling input distributions. Thus, the estimated input distribution models are uncertain. In addition, the simulation model could be biased due to assumptions and idealizations used in the modeling process. Furthermore, only a limited number of physical output test data is available in practical engineering applications. As a result, the target output distributions, against which the simulation model can be validated, are uncertain, and the corresponding reliabilities become uncertain as well. To properly assess the conservative reliability of a product under the uncertainties due to limited numbers of both input and output test data and a biased simulation model, a confidence-based reliability assessment method is developed in this paper. In the developed method, a hierarchical Bayesian model is formulated to obtain the uncertainty distribution of reliability. A target confidence level is then specified, and the reliability value at that confidence level, obtained from the uncertainty distribution of reliability, is the confidence-based reliability, i.e., the confidence-based estimate of the true reliability. It is numerically demonstrated that the proposed method can predict the reliability of a physical system that satisfies the user-specified target confidence level using limited numbers of input and output test data.

Reliability-based design optimization of wind turbine drivetrain with integrated multibody gear dynamics simulation considering wind load uncertainty

This study aims to develop an integrated computational framework for the reliability-based design optimization (RBDO) of wind turbine drivetrains to ensure the target reliability under wind load and gear manufacturing uncertainties. Gears in wind turbine drivetrains are subjected to severe cyclic loading due to highly variable wind loads that are stochastic in nature. Thus, the failure rate of drivetrain systems is reported to be higher than that of other wind turbine components, and improving drivetrain reliability is critically important in reducing the downtime caused by gear failures. In the numerical procedure developed in this study, a wide spatiotemporal variability of wind loads is considered using 249 sets of wind data to evaluate the probabilistic contact fatigue life in the sampling-based RBDO. To account for wind load uncertainty in the evaluation of tooth contact fatigue, multiple drivetrain dynamics simulations need to be run under various wind load scenarios in the RBDO process. For this reason, a numerical procedure based on the multivariable tabular contact search algorithm is applied to the modeling of wind turbine drivetrains to reduce the overall computational time while retaining the precise contact geometry required for the gear tooth profile optimization. An integrated computational framework for the wind turbine drivetrain RBDO is then developed by incorporating the wind load uncertainty, the rotor blade aerodynamics model, the drivetrain dynamics model, and the probabilistic contact fatigue failure model. It is demonstrated that the RBDO optimum for a 750 kW wind turbine drivetrain obtained using this procedure achieves the target 97.725% reliability (2-sigma quality level) with only a 1.4% increase in total weight over the baseline design, which had a reliability of 8.3%.
Furthermore, it is shown that the tooth profile optimization, with tip relief introduced as a design variable, prevents the large increase in face width, and hence in drivetrain weight (cost), that would otherwise be required to satisfy the target reliability against tooth contact fatigue failure.
