Abstract

The variational multiscale (VMS) formulation formally segregates the evolution of the coarse scales from that of the fine scales. VMS modeling requires approximating the impact of the fine scales in terms of the coarse scales. In linear problems, our formulation reduces the problem of learning the sub-scales to that of learning the projected element Green's function basis coefficients. For this approximation, a special neural-network structure, the variational super-resolution neural network (VSRNN), is proposed. The VSRNN constructs a super-resolved model of the unresolved scales as a sum of products of individual functions of the coarse scales and physics-informed parameters. Combined with a set of locally non-dimensional features obtained by normalizing the input coarse-scale and output sub-scale basis coefficients, the VSRNN provides a general framework for the discovery of closures for both continuous and discontinuous Galerkin discretizations. By training this model on a sequence of $L_2$-projected data, using the sub-scales to compute the continuous Galerkin subgrid terms and the super-resolved state to compute the discontinuous Galerkin fluxes, we improve the optimality and accuracy of these methods for convection-diffusion, linear advection, and turbulent channel flow problems. Finally, we demonstrate that, in the investigated examples, the present model generalizes to out-of-sample initial conditions and Reynolds numbers. Perspectives are provided on data-driven closure modeling, limitations of the present approach, and opportunities for improvement.
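The sum-of-products structure described in the abstract can be illustrated with a minimal sketch. All names, layer sizes, and the polynomial "physics features" below are illustrative assumptions, not the paper's actual architecture: each term pairs a small network acting on the coarse-scale basis coefficients with a physics-informed scalar weight (here, a hypothetical function of a local Peclet number).

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, W1, b1, W2, b2):
    # one-hidden-layer tanh MLP mapping coarse-scale coefficients
    # to sub-scale basis coefficients
    return np.tanh(x @ W1 + b1) @ W2 + b2

# Hypothetical sizes: n_c coarse coefficients in, n_f sub-scale
# coefficients out, K sum-of-products terms.
n_c, n_f, K, hidden = 4, 8, 3, 16

# one small (untrained, randomly initialized) MLP per term
nets = [
    (rng.normal(size=(n_c, hidden)) * 0.1, np.zeros(hidden),
     rng.normal(size=(hidden, n_f)) * 0.1, np.zeros(n_f))
    for _ in range(K)
]

def physics_features(pe):
    # hypothetical physics-informed parameters: K scalar weights,
    # here simple powers of a local Peclet number
    return np.array([1.0, pe, pe**2])[:K]

def vsrnn(u_coarse, pe):
    # sub-scale prediction as a sum of products of coarse-scale
    # functions and physics-informed parameters
    g = physics_features(pe)
    return sum(g[k] * mlp_forward(u_coarse, *nets[k]) for k in range(K))

u_c = rng.normal(size=n_c)    # coarse-scale basis coefficients
u_sub = vsrnn(u_c, pe=2.0)    # predicted sub-scale coefficients
print(u_sub.shape)            # (8,)
```

In the actual method, the networks would be trained against $L_2$-projected sub-scale data; this sketch only shows the structural decomposition of the output.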
