Abstract

In virtualizing engineered systems, it is essential to develop simulators capable of representing the system in its "as-deployed" state. Any attempt to this end can only be approximate, given the inherent uncertainties in the loadings and operational conditions of the system, as well as in the configuration of the system itself (geometry, materials, control systems, boundary conditions, etc.). This is especially true for complex systems, such as wind turbines, where a number of assumptions typically govern the setup of the engineering models. Such models are often made available at different granularities, with each one offering a different level of precision depending on the quantity of interest (e.g. macroscopic displacements or microscopic strains) and the properties of the acting loads (e.g. amplitude and frequency content). This implies that predictive capability is severely hampered when a single, so-deemed best model is chosen for simulation. Building on this idea, we here present a method for fusing the outputs of multiple simulators (e.g. aero-servo-hydro-elastic simulators) to estimate a quantity of interest (QoI) with higher precision. The proposed ensemble learning approach comprises two main building blocks. The first is a clustering step, by means of a variational Bayesian Gaussian mixture model, employed for the weighting of each available simulator. Clustering is performed on the binned input space, which allows a probability map to be extracted for each local region of that space. This delivers an adaptive scheme in which different simulators contribute more or less prominently to the prediction of the QoI, depending on the range of the input parameters. Local weighted Bootstrap Aggregation is then executed in a second step to combine the clustered ensemble of outputs from the individual simulators.
A simulated toy example and a wind turbine blade fatigue case study are herein exploited to demonstrate the efficacy of the suggested ensemble learning scheme. The approach is compared against alternatives typically adopted in the existing literature, such as Stacking, classical Bagging, and Bayesian Model Averaging. The results confirm improved predictive capability, expressed through a reduction in the generalization error and a narrowing of the associated confidence interval.

Highlights

  • The predictions from stochastic numerical simulators used in the design of machines and structures are subject to uncertainties

  • We propose an ensemble learning framework, termed UnLoCWeB, which relies on unsupervised variational Bayesian Gaussian mixture clustering and local weighted bootstrap aggregation of outputs from individual simulators

  • The framework is here illustrated in the context of predictions for engineering systems where response variability is noted across the range of operational loads, and where the assumed properties of the engineering models can influence response predictions


Introduction

The predictions from stochastic numerical simulators used in the design of machines and structures are subject to uncertainties. Recognizing that the range of inputs is a salient parameter in the modeling of wind turbine systems, we here present an unsupervised ensemble learning method, which treats individual simulator outputs as components contributing to an ensemble estimate, as opposed to competing predictions.

Method

The method exploits the probabilistic output from a diversified set of individual simulators (engineering models of different granularity). It first extracts the local weights of these simulators via an unsupervised variational Bayesian Gaussian mixture (VBGM) scheme, and then estimates a weighted ensemble, delivered by Bootstrap Aggregation (weighted Bagging). The weight of simulator S_i within cluster C_j, for input bin ∆x, is computed as

w_{S_i|C_j,∆x} = N_{S_i|C_j,∆x} / N_{C_j|∆x}

where N_{S_i|C_j,∆x} is the number of data points in cluster C_j corresponding to simulator S_i, and N_{C_j|∆x} is the total number of data points in cluster C_j for bin ∆x. Local cluster-weighted Bootstrap Aggregating (Bagging) follows [10], fusing the ensemble of outputs from each bin - based on the previously extracted weights - into one single aggregated predictor, the final local cluster-weighted bagged estimator ĝ_{w,bagg}.
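The two-step scheme described above can be sketched as follows. This is a minimal, hypothetical illustration using scikit-learn's BayesianGaussianMixture: the two toy simulators, the error-based attribution of data points to simulators, and all numerical settings are illustrative assumptions (not the paper's setup), and the bootstrap-resampling step of weighted Bagging is reduced to its expected value, the cluster-weighted average.

```python
# Hedged sketch of the two-step scheme: VBGM clustering of the input space,
# then a local cluster-weighted combination of simulator outputs.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)

# Two hypothetical "simulators" of different fidelity across the input range:
# sim_a's bias grows with x, sim_b's bias shrinks with x.
def sim_a(x):
    return np.sin(x) + 0.02 * x

def sim_b(x):
    return np.sin(x) + 0.02 * (10.0 - x)

x = rng.uniform(0.0, 10.0, size=500)
y_true = np.sin(x)  # toy reference response

# Step 1: VBGM clustering of the input space; the variational treatment
# can prune superfluous mixture components automatically.
vbgm = BayesianGaussianMixture(n_components=5, random_state=0, max_iter=500)
labels = vbgm.fit_predict(x.reshape(-1, 1))

# Step 2: local weights w_{S_i|C_j} = N_{S_i|C_j} / N_{C_j}. Here a point is
# attributed to the simulator with the smaller local error, as a stand-in
# for the binning-based attribution used in the paper.
errs = np.vstack([np.abs(sim_a(x) - y_true), np.abs(sim_b(x) - y_true)])
best = errs.argmin(axis=0)  # index of the locally better simulator

n_sims = 2
w = np.zeros((vbgm.n_components, n_sims))
for j in range(vbgm.n_components):
    in_j = labels == j
    if in_j.any():
        for i in range(n_sims):
            w[j, i] = np.mean(best[in_j] == i)  # N_{S_i|C_j} / N_{C_j}

# Step 3: cluster-weighted combination of the simulator outputs. Full
# weighted Bagging would additionally bootstrap-resample within each
# cluster to obtain confidence intervals; this average is its expectation.
preds = np.vstack([sim_a(x), sim_b(x)])       # shape (n_sims, n_points)
y_hat = (w @ preds)[labels, np.arange(x.size)]
```

Because the weights are extracted per cluster, each simulator dominates the ensemble only in the input regions where it performs well, which is the adaptive behavior the method targets; the variational Bayesian formulation also spares the user from fixing the number of clusters exactly.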

Analytical computational experiment
Findings
Practical Case Study