Variational Path Sampling of Rare Dynamical Events
This article reviews the concepts and methods of variational path sampling. These methods allow computational studies of rare events in systems driven arbitrarily far from equilibrium. Based upon a statistical mechanics of trajectory space and leveraging the theory of large deviations, they provide a perspective from which dynamical phenomena can be studied with the same types of ensemble reweighting ideas that have been used for static equilibrium properties. Applications to chemical, material, and biophysical systems are highlighted.
- Research Article
- 10.2352/issn.2470-1173.2016.19.coimg-156
- Feb 14, 2016
- Electronic Imaging
Certain low-probability events in material systems have a considerable impact on system characterization. Although rare event simulation is well studied in areas such as financial risk assessment and communication systems, modeling and simulation of rare events in material systems remain under-explored. In this paper, we turn to large deviations theory and importance sampling to simulate an important rare event that arises in polycrystalline materials. More specifically, the microstructure of a polycrystalline material consists of grains with different orientations; these grains evolve over time, a phenomenon called grain growth. Grain growth is slow, making direct observation expensive and impractical, so computational methods have been developed to simulate it. One low-probability event of interest involves a single grain growing abnormally large at the expense of the others. Although Gibbs-distribution-based models of abnormal grain growth exist, the event is still rare enough under such models that many samples must be drawn before an abnormal growth manifests. We propose an importance sampling distribution from which to draw samples to simulate abnormal grain growth, in place of the conventional Gibbs distribution used to model grain growth. The proposed distribution is based on an asymptotically efficient rare event probability estimator. With our method, we consistently generate abnormal grain growth, providing a reliable solution to this materials science problem. The approach can potentially be used to simulate rare events in any system modeled by a Gibbs distribution.
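The importance-sampling idea underlying this abstract can be illustrated on a toy problem (a Gaussian tail rather than the paper's Gibbs-distributed microstructure; the function and parameter names here are illustrative, not the authors'):

```python
import math
import random

def rare_prob_is(threshold=4.0, shift=5.0, n=100_000, seed=1):
    """Estimate P(X > threshold) for X ~ N(0, 1) by importance sampling.

    Samples come from the shifted proposal N(shift, 1); each sample that
    lands in the rare region is reweighted by the likelihood ratio
    p(x)/q(x) = exp(shift**2 / 2 - shift * x), keeping the estimator unbiased.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)        # draw from the proposal, not the target
        if x > threshold:                # rare-event indicator
            total += math.exp(0.5 * shift * shift - shift * x)
    return total / n
```

The exact tail is 1 − Φ(4) ≈ 3.17 × 10⁻⁵; a naive Monte Carlo run with the same budget would typically see only a few hits, while the shifted proposal makes most samples land in the rare region and lets the weights do the correction.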
- Conference Article
- 10.1109/wts.2011.5960826
- Apr 1, 2011
The paper presents an effective approach to estimating the probability of buffer overflow in wireless telecommunications networks. The buffer overflow probability in queuing systems is a rare event probability and can be estimated using rare event simulation with Markov chains. Two-node queuing networks are considered, and overflow of the buffer at the second node is studied: the probability that the content of the second buffer exceeds some high level L, starting from a certain state. The approach is based on a Markov additive representation of the buffer processes, leading to an exponential change of measure that is used in an importance sampling method. The examples considered in the paper confirm that when the first buffer is finite, the relative error is bounded independently of the level L. When the first buffer is infinite, a natural extension of the exponential change of measure for the finite-buffer case is proposed; the relative error is shown to be bounded independently of L only when the second node is the bottleneck, i.e., where buffer overflow may occur. When the first node is the bottleneck, experimental results show that the relative error grows linearly with the level L. Two efficient rare event simulation algorithms, based on the importance sampling and cross-entropy methods, are developed and applied to accelerate the overflow probability simulation with Markov chain modeling in wireless telecommunications networks. Numerical examples and simulation results are provided.
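The exponential change of measure the abstract refers to can be sketched on a single-buffer birth-death walk (a deliberate simplification of the paper's two-node network; names and parameters are illustrative):

```python
import random

def overflow_prob(L=15, p=0.3, start=1, n=20_000, seed=7):
    """Estimate the probability that a birth-death chain started at `start`
    hits level L before emptying (state 0), via exponential change of measure.

    Under the tilted measure the up/down probabilities are swapped (p <-> 1-p),
    which makes the overflow path typical; each absorbed path is reweighted by
    its accumulated likelihood ratio so the estimator stays unbiased.
    """
    q = 1.0 - p
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x, lr = start, 1.0
        while 0 < x < L:
            if rng.random() < q:      # tilted chain steps up with probability q
                x += 1
                lr *= p / q           # original up-prob over tilted up-prob
            else:
                x -= 1
                lr *= q / p           # original down-prob over tilted down-prob
        if x == L:                    # overflow reached: count the weighted path
            total += lr
    return total / n
```

For these parameters the gambler's-ruin formula gives roughly 4 × 10⁻⁶; since every overflow path here carries the same likelihood ratio, the estimator's relative error stays bounded as L grows, which is the property the paper analyses.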
- Abstract
- 10.1016/j.bpj.2013.11.2132
- Jan 1, 2014
- Biophysical Journal
Accelerating Systems Biology Computation: Rapid Estimation of Equilibrium and Kinetic Quantities via Weighted Ensemble Sampling
- Research Article
- 10.1021/acs.jctc.2c01088
- Mar 1, 2023
- Journal of Chemical Theory and Computation
We present a rare event sampling scheme applicable to coupled electronic excited states. In particular, we extend the forward flux sampling (FFS) method for rare event sampling to a nonadiabatic version (NAFFS) that uses the trajectory surface hopping (TSH) method for nonadiabatic dynamics. NAFFS is applied to two dynamically relevant excited-state models that feature an avoided crossing and a conical intersection with tunable parameters. We investigate how nonadiabatic couplings, temperature, and reaction barriers affect transition rate constants in regimes that cannot be otherwise obtained with plain, traditional TSH. The comparison with reference brute-force TSH simulations for limiting cases of rarity shows that NAFFS can be several orders of magnitude cheaper than conventional TSH and thus represents a conceptually novel tool to extend excited-state dynamics to time scales that are able to capture rare nonadiabatic events.
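Plain (adiabatic) FFS, which NAFFS extends, can be sketched on a one-dimensional double well. This toy omits the initial-flux factor of the full FFS rate expression and restarts each stage at the interface value, which is only valid here because the overdamped 1D state is the position alone; interface placement and all parameters are illustrative:

```python
import math
import random

def ffs_crossing_prob(temp=0.4, dt=1e-3, m_trials=300, seed=3):
    """Toy forward flux sampling for overdamped Langevin dynamics in the
    double well V(x) = (x**2 - 1)**2, using x itself as the order parameter.

    Interfaces sit between basin A (around x = -1) and basin B (around x = +1);
    the probability of reaching B from the first interface is the product of
    the stagewise probabilities P(lambda_{i+1} | lambda_i).
    """
    rng = random.Random(seed)
    lam = [-0.7, -0.35, 0.0, 0.35, 0.7]   # interfaces ordered from A toward B
    a_edge = -0.9                          # below this, the trial is back in A
    sigma = math.sqrt(2.0 * temp * dt)     # noise amplitude per step

    def force(x):                          # -dV/dx for the double well
        return -4.0 * x * (x * x - 1.0)

    def reaches(start, target):
        """Propagate from `start` until crossing `target` or falling into A."""
        x = start
        while a_edge < x < target:
            x += force(x) * dt + sigma * rng.gauss(0.0, 1.0)
        return x >= target

    probs, p_b = [], 1.0
    for i in range(len(lam) - 1):
        succ = sum(reaches(lam[i], lam[i + 1]) for _ in range(m_trials))
        probs.append(succ / m_trials)
        p_b *= probs[-1]
    return probs, p_b
```

Each stage only has to resolve a modestly improbable hop to the next interface, which is what lets FFS-type schemes reach rates that brute-force trajectories cannot.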
- Research Article
- 10.1016/j.cplett.2015.05.011
- May 22, 2015
- Chemical Physics Letters
Classical-quantum correspondence in a model for conformational dynamics: Connecting phase space reactive islands with rare events sampling
- Research Article
- 10.1016/j.cpc.2014.03.013
- Mar 21, 2014
- Computer Physics Communications
The Flexible Rare Event Sampling Harness System (FRESHS)
- Preprint Article
- 10.5194/egusphere-egu23-8340
- May 15, 2023
Understanding extreme events and their probability is key for the study of climate change impacts, risk assessment, adaptation, and the protection of living beings. Extreme heatwaves are, and will likely remain, among the deadliest weather events. Forecasting their occurrence probability days, weeks, or months in advance is a primary challenge for risk assessment and attribution, but also for fundamental studies of processes, dataset or model validation, and climate change. Uncertainty quantification for extreme events is extremely difficult because of the short historical record, the rarity of the events, and the difficulty of obtaining rare events in climate models. We develop a methodology that tackles this problem by combining probabilistic machine learning using deep neural networks with rare event simulations. We first demonstrate that deep neural networks can predict the probability of occurrence of long-lasting 14-day heatwaves over France up to 15 days ahead of time from fast dynamical drivers (500 hPa geopotential height fields), and at much longer lead times from slow physical drivers (soil moisture). This forecast is made seamlessly in time and space, for fast hemispheric and slow local drivers. A key scientific message is that training deep neural networks to predict extreme heatwaves takes place in a regime of drastic lack of data. We suggest that this is likely the case for most other applications of machine learning to large-scale atmosphere and climate phenomena, and we discuss perspectives for dealing with this lack of data, for instance using rare event simulations. Rare event simulations are a very efficient tool for drastically oversampling the statistics of rare events: using a climate model, we obtain several orders of magnitude more extreme heatwaves than in a control run.
We will discuss the coupling of machine learning approaches, for instance the analogue method, with rare event simulations, and their efficiency and future interest for climate simulations.
- Research Article
- 10.1007/s10009-022-00675-x
- Oct 1, 2022
- International Journal on Software Tools for Technology Transfer
Dynamic fault trees (DFTs) are widely adopted in industry to assess the dependability of safety-critical equipment. Since many systems are too large to be studied numerically, DFT dependability is often analysed using Monte Carlo simulation. A bottleneck here is that many simulation samples are required in the case of rare events, e.g. in highly reliable systems where components seldom fail. Rare event simulation (RES) provides techniques to reduce the number of samples in the case of rare events. In this article, we present a RES technique based on importance splitting to study failures in highly reliable DFTs, more precisely, on a variant of repairable fault trees (RFT). Whereas RES usually requires meta-information from an expert, our method is fully automatic. For this, we propose two different methods to derive the so-called importance function. On the one hand, we propose to cleverly exploit the RFT structure to compositionally construct such a function. On the other hand, we explore different importance functions derived in different ways from the minimal cut sets of the tree, i.e., the minimal units that determine its failure. We handle RFTs with Markovian and non-Markovian failure and repair distributions—for which no numerical methods exist—and implement the techniques on a toolchain that includes the RES engine FIG, for which we also present improvements. We finally show the efficiency of our approach in several case studies.
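A minimal sketch of importance splitting with a structure-derived importance function, on a toy repairable system rather than a full RFT (here the importance function is simply the number of failed components, in the spirit of the compositional functions the article constructs; all names and rates are illustrative):

```python
import random

def cycle_failure_prob(n_comp=3, fail_rate=0.1, repair_rate=1.0,
                       trials_per_level=2000, seed=11):
    """Fixed-effort multilevel splitting for a toy repairable system.

    Level k of the importance function means "k components down"; the
    probability that a busy cycle ends in total failure is the product of the
    level-to-level conditional probabilities. Hitting probabilities depend
    only on the embedded jump chain, so that chain is simulated directly.
    """
    rng = random.Random(seed)

    def up_prob(n_failed):
        lam = (n_comp - n_failed) * fail_rate   # total failure rate
        mu = n_failed * repair_rate             # total repair rate
        return lam / (lam + mu)                 # next jump is a failure

    def reach_next(start):
        """From `start` failed components, do we hit start+1 before 0?"""
        target, n = start + 1, start
        while 0 < n < target:
            n += 1 if rng.random() < up_prob(n) else -1
        return n == target

    p_cycle = 1.0                # from 0 failed, the first jump is a failure
    for level in range(1, n_comp):
        succ = sum(reach_next(level) for _ in range(trials_per_level))
        p_cycle *= succ / trials_per_level
    return p_cycle
```

Splitting concentrates the simulation effort near the failure region: each level transition is only moderately unlikely, so no meta-information beyond the importance function is needed, which is the automation point the article makes.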
- Research Article
- 10.1504/ijmndi.2012.054449
- Jan 1, 2012
- International Journal of Mobile Network Design and Innovation
The paper presents an effective approach to estimating the probability of buffer overflow in wireless communication networks. The buffer overflow probability in queuing systems is a rare event probability and can be estimated using rare event simulation with Markov chains. Two-node queuing networks are considered, and an event of buffer overflow at the second node is studied. Two efficient rare event simulation algorithms, based on the importance sampling and cross-entropy methods, are developed and applied to accelerate the overflow probability simulation with Markov chain modelling. Numerical examples and simulation results are provided.
- Research Article
- 10.1007/s40072-017-0100-y
- May 22, 2017
- Stochastics and Partial Differential Equations: Analysis and Computations
The goal of this paper is to develop provably efficient importance sampling Monte Carlo methods for the estimation of rare events within the class of linear stochastic partial differential equations. We find that if a spectral gap of appropriate size exists, then one can identify a lower dimensional manifold where the rare event takes place. This allows one to build importance sampling changes of measures that perform provably well even pre-asymptotically (i.e. for small but non-zero size of the noise) without degrading in performance due to infinite dimensionality or due to long simulation time horizons. Simulation studies supplement and illustrate the theoretical results.
- Book Chapter
- 10.1007/978-3-030-51264-4_7
- Jan 1, 2020
Rare weather and climate events, such as heat waves and floods, can bring tremendous social costs. Climate data is often limited in duration and spatial coverage, and so climate forecasting has often turned to simulations of climate models to make better predictions of rare weather events. However, very long simulations of complex models, in order to obtain accurate probability estimates, may be prohibitively slow. It is an important scientific problem to develop probabilistic and dynamical techniques to estimate the probabilities of rare events accurately from limited data. In this paper we compare four modern methods of estimating the probability of rare events: the generalized extreme value (GEV) method from classical extreme value theory; two importance sampling techniques, genealogical particle analysis (GPA) and the Giardina-Kurchan-Lecomte-Tailleur (GKLT) algorithm; as well as brute force Monte Carlo (MC). With these techniques we estimate the probabilities of rare events in three dynamical models: the Ornstein-Uhlenbeck process, the Lorenz '96 system and PlaSim (a climate model). We keep the computational effort constant and see how well the rare event probability estimation of each technique compares to a gold standard afforded by a very long control run. Somewhat surprisingly, we find that classical extreme value theory methods outperform GPA, GKLT and MC at estimating rare events.
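The GEV block-maxima method can be sketched in its Gumbel (shape ξ = 0) special case, the correct limit for Gaussian data; this is a method-of-moments fit rather than the MLE a production EVT analysis would use, and all names and parameters are illustrative:

```python
import math
import random

EULER_GAMMA = 0.5772156649

def gumbel_tail_estimate(block=100, n_blocks=500, z=4.0, seed=5):
    """Fit a Gumbel distribution (the GEV family's xi = 0 member) to block
    maxima of standard Gaussian noise by the method of moments, then
    extrapolate the exceedance probability P(block max > z).
    """
    rng = random.Random(seed)
    maxima = [max(rng.gauss(0, 1) for _ in range(block))
              for _ in range(n_blocks)]
    mean = sum(maxima) / n_blocks
    var = sum((m - mean) ** 2 for m in maxima) / (n_blocks - 1)
    beta = math.sqrt(6.0 * var) / math.pi   # Gumbel scale from the variance
    mu = mean - EULER_GAMMA * beta          # Gumbel location from the mean
    p_exceed = 1.0 - math.exp(-math.exp(-(z - mu) / beta))
    return mu, beta, p_exceed
```

The Gumbel extrapolation is biased for Gaussian maxima at finite block size (convergence to the limit is slow), but it yields a tail estimate far beyond the largest observed maximum, which is what makes EVT competitive at fixed computational effort.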
- Research Article
- 10.1177/1548512920934551
- Jun 27, 2020
- The Journal of Defense Modeling and Simulation: Applications, Methodology, Technology
Inherent vulnerabilities in a cyber network’s constituent machine services can be exploited by malicious agents. As a result, the machines on any network are at risk. Security specialists seek to mitigate the risk of intrusion events through network reconfiguration and defense. When dealing with rare cyber events, high-quality risk estimates using standard simulation approaches may be unattainable, or have significant attached uncertainty, even with a large computational simulation budget. To address this issue, an efficient rare event simulation modeling and analysis technique, namely, importance sampling for cyber networks, is developed. The importance sampling method parametrically amplifies certain aspects of the network in order to cause a rare event to happen more frequently. Output collected under these amplified conditions is then scaled back into the context of the original network to provide meaningful statistical inferences. The importance sampling methodology is tailored to cyber network attacks and takes the attacker’s successes and failures as well as the attacker’s targeting choices into account. The methodology is shown to produce estimates of higher quality than standard simulation with greater computational efficiency.
- Dissertation
- 10.5451/unibas-007058148
- Jan 1, 2016
In this PhD thesis, molecular systems of increasing size and complexity are investigated using both standard and advanced sampling methods. The implementation and validation of two such rare event sampling methods, the SA-MC and PINS algorithms, is described. The development and use of a toolkit for fitting force field parameters (Lennard-Jones and multipole parameters), the Fitting Wizard, is presented. The stability of the haemoglobin tetramer in solution is also investigated using standard molecular dynamics. The first two chapters introduce the necessary theoretical background and are followed by the results sections containing the articles written during this PhD.
- Research Article
- 10.1088/1742-6596/2096/1/012151
- Nov 1, 2021
- Journal of Physics: Conference Series
The paper presents algorithms for simulating rare events in stochastic systems based on the theory of large deviations. This approach is used in conjunction with tools from optimal control theory to estimate the probability that an observed state of a stochastic system will exceed a given threshold by some upcoming time instant. Algorithms are presented for obtaining the controlled extremal trajectory (A-profile) of the system, along which the transition to the rare event (threshold crossing) most likely occurs under the influence of disturbances that minimize the action functional. It is also shown how this minimization can be performed efficiently using numerical-analytical methods of optimal control for linear and nonlinear systems. The results are illustrated with an example of a precipitation-measured monsoon intraseasonal oscillation (MISO) described by a low-order nonlinear stochastic model.
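The "action functional" here is presumably of the standard Freidlin-Wentzell form; for a diffusion with small noise, the variational problem the extremal trajectory solves can be sketched as follows (notation ours, not the paper's):

```latex
% Small-noise diffusion, its rate functional, and the large-deviation estimate
\begin{aligned}
  \mathrm{d}x_t &= b(x_t)\,\mathrm{d}t + \sqrt{\varepsilon}\,\sigma\,\mathrm{d}W_t,\\
  S_T[x] &= \frac{1}{2}\int_0^T \bigl\|\sigma^{-1}\bigl(\dot{x}_t - b(x_t)\bigr)\bigr\|^2\,\mathrm{d}t,\\
  \mathbb{P}\bigl(x_T \ge L\bigr) &\asymp \exp\Bigl(-\tfrac{1}{\varepsilon}\,\inf_{x(\cdot):\,x_T \ge L} S_T[x]\Bigr).
\end{aligned}
```

Interpreting $u_t = \sigma^{-1}(\dot{x}_t - b(x_t))$ as a control turns the infimum into an optimal control problem whose minimizer is the extremal (A-profile) trajectory, matching the abstract's coupling of large deviations with optimal control.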
- Research Article
- 10.1063/1.3525099
- Dec 22, 2010
- The Journal of Chemical Physics
Although many computational methods for rare event sampling exist, this type of calculation is not usually practical for general nonequilibrium conditions, with macroscopically irreversible dynamics and away from both stationary and metastable states. A novel method for calculating the time series of the probability of a rare event is presented that is designed for these conditions. The method is validated for the cases of the Glauber-Ising model under time-varying shear flow, the Kawasaki-Ising model after a quench into the region between nucleation dominated and spinodal decomposition dominated phase change dynamics, and the parallel open asymmetric exclusion process. The method requires a subdivision of the phase space of the system: it is benchmarked and found to scale well for increasingly fine subdivisions, meaning that it can be applied without detailed foreknowledge of the physically important reaction pathways.