Abstract

We present a class of efficient parametric closure models for 1D stochastic Burgers equations. Casting closure modeling as statistical learning of the flow map, we derive the parametric form by representing the unresolved high-wavenumber Fourier modes as functionals of the resolved variable’s trajectory. The reduced models are nonlinear autoregression (NAR) time series models, with coefficients estimated from data by least squares. The NAR models can accurately reproduce the energy spectrum, the invariant densities, and the autocorrelations. Taking advantage of the simplicity of the NAR models, we investigate maximal space-time reduction. Reduction in space dimension is unlimited, and NAR models with two Fourier modes can perform well. The NAR model’s stability limits time reduction, with a maximal time step smaller than that of the K-mode Galerkin system. We report a potential criterion for optimal space-time reduction: the NAR models achieve minimal relative error in the energy spectrum at the time step at which the K-mode Galerkin system’s mean Courant–Friedrichs–Lewy (CFL) number agrees with that of the full model.
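To make the least-squares estimation concrete, the following sketch fits a linear-in-coefficients NAR model to trajectories of resolved Fourier modes. The feature map (lagged states plus an intercept) and the memory length `p` are illustrative assumptions, not the paper's exact parametric form.

```python
import numpy as np

def fit_nar(u, p=2):
    """Least-squares fit of a linear-in-coefficients NAR model.

    u : (T, K) array of resolved-mode trajectories (assumed data layout).
    p : memory length, i.e. number of past steps used as regressors.
    Sketch model (hypothetical, not the paper's exact form):
        u[n] = A_0 u[n-1] + ... + A_{p-1} u[n-p] + b
    """
    T, K = u.shape
    # Regression matrix: lagged states stacked column-wise, plus intercept.
    X = np.hstack([u[p - 1 - j : T - 1 - j] for j in range(p)])
    X = np.hstack([X, np.ones((X.shape[0], 1))])
    Y = u[p:]
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return coef  # shape (p*K + 1, K)

def predict_nar(coef, history, p=2):
    """One-step prediction from the last p states in `history`."""
    x = np.hstack([history[-1 - j] for j in range(p)] + [1.0])
    return x @ coef
```

In this formulation the model is linear in its coefficients, so estimation reduces to one `lstsq` call; nonlinear terms (e.g. products of modes) would simply add columns to the regression matrix.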

Highlights

  • Closure modeling aims for computationally efficient reduced models for tasks requiring repeated simulations, such as Bayesian uncertainty quantification [1,2] and data assimilation [3,4]. Consisting of low-dimensional resolved variables, the closure model must account for the non-negligible effects of unresolved variables so as to capture both the short-time dynamics and the large-time statistics.

  • We test the nonlinear autoregression (NAR) models in four settings: reduction of deterministic responses (K > K0) vs. reduction involving unresolved stochastic force (K < K0), and small vs. large scales of stochastic force, where K0 is the number of Fourier modes of the white-in-time stochastic force and σ is the scale of the force.

  • We explore the maximal time step that NAR models can reach by testing time steps δ = dt × {5, 10, 20, 30, 40, 50, 80, 160}.
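The abstract's proposed criterion for the optimal time step compares the mean CFL number of the K-mode Galerkin system with that of the full model. A minimal sketch of that diagnostic, assuming the solution is sampled on a uniform grid of spacing dx at intervals dt:

```python
import numpy as np

def mean_cfl(u, dt, dx):
    """Mean Courant number over a trajectory.

    u : (T, N) array of the solution on a grid of spacing dx,
        sampled every dt (assumed layout).
    The Courant number at each time is max_x |u(x, t)| * dt / dx;
    we average it over the trajectory.
    """
    return np.mean(np.max(np.abs(u), axis=1)) * dt / dx
```

Evaluating this for the full model and for each candidate reduced time step δ lets one locate the δ at which the two mean CFL numbers agree.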

Summary

Introduction

Closure modeling aims for computationally efficient reduced models for tasks requiring repeated simulations, such as Bayesian uncertainty quantification [1,2] and data assimilation [3,4]. The past decades have witnessed revolutionary developments of data-driven strategies, ranging from parametric models (see, e.g., [8,9,10,11,12,13,14] and the references therein) to nonparametric and machine learning methods (see, e.g., [15,16,17,18]). These developments demand a systematic understanding of model reduction from the perspectives of dynamical systems (see, e.g., [7,19,20]), numerical approximation [21,22], and statistical learning [17,23]. Both approaches approximate the target stochastic process: the POD-ROMs are based on the Karhunen–Loève expansion, while the inference-based closure models aim to learn the nonlinear flow map. Following a brief review of the basic properties of the stochastic Burgers equation and its numerical integration, we introduce in Section 2 the inference approach to closure modeling and compare it with the nonlinear Galerkin methods. The relevant parameters are the viscosity in (1), the strength of the stochastic force, and the number of modes and time step-size in the numerical solution.

The Stochastic Burgers Equation
Galerkin Spectral Method
Nonlinear Galerkin and Inferential Model Reduction
Derivation of Parametric Reduced Models
The Numerical Reduced Model in Fourier Modes
Data Generation and Parameter Estimation
Model Selection
Numerical Study on Space-Time Reduction
Settings
Model Selection and Memory Length
1: Relative error in energy by the NAR
Energy spectrum of NAR models
Reduction
Reduction Involving Unresolved Stochastic Force
5: Marginal
Findings
Conclusions
