Abstract

This paper presents data-driven learning of localized reduced models. Instead of a global reduced basis, the approach employs multiple local approximation subspaces. This localization permits adaptation of a reduced model to local dynamics, thereby keeping the reduced dimension small. This is particularly important for reduced models of nonlinear systems of partial differential equations, where the solution may be characterized by different physical regimes or exhibit high sensitivity to parameter variations. The contribution of this paper is a non-intrusive approach that learns the localized reduced model from snapshot data using operator inference. In the offline phase, the approach partitions the state space into subregions and solves a regression problem to determine localized reduced operators. During the online phase, a local basis is chosen adaptively based on the current system state. The non-intrusive nature of localized operator inference makes the method accessible, portable and applicable to a broad range of scientific problems, including those that use proprietary or legacy high-fidelity codes. We demonstrate the potential for achieving large computational speedups while maintaining good accuracy for a Burgers' equation governing shock propagation in a one-dimensional domain and a phase-field problem governed by the Cahn-Hilliard equation. This article is part of the theme issue 'Data-driven prediction in dynamical systems'.
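
To make the offline/online workflow sketched in the abstract concrete, the following is a minimal illustrative sketch, not the authors' implementation. It assumes a quadratic reduced model of the form d/dt q_hat = A_k q_hat + H_k (q_hat kron q_hat) in each subregion, partitions the snapshots with k-means clustering, builds local proper orthogonal decomposition (POD) bases, fits the local operators with a Tikhonov-regularized least-squares regression, and advances the online model with an explicit Euler step. All of these specific choices, as well as the function names offline and online_step, are assumptions made for illustration only.

    # Sketch of localized operator inference (illustration only).
    # Q    : snapshot matrix, shape (n, m), columns are full-order states
    # Qdot : time-derivative snapshots, shape (n, m)
    import numpy as np
    from scipy.cluster.vq import kmeans2

    def offline(Q, Qdot, n_clusters=3, r=10, reg=1e-8):
        """Partition snapshots, build local POD bases, infer local operators."""
        # 1) Partition the state space by clustering the snapshots.
        centroids, labels = kmeans2(Q.T, n_clusters, minit='++')
        models = []
        for k in range(n_clusters):
            Qk, Qdk = Q[:, labels == k], Qdot[:, labels == k]
            # 2) Local POD basis for subregion k (local dimension rk <= r).
            U, _, _ = np.linalg.svd(Qk, full_matrices=False)
            rk = min(r, Qk.shape[1])
            Vk = U[:, :rk]
            Qh, Qdh = Vk.T @ Qk, Vk.T @ Qdk          # reduced snapshots and derivatives
            # 3) Regression for local operators A_k (rk x rk), H_k (rk x rk^2).
            Kron = np.einsum('ik,jk->ijk', Qh, Qh).reshape(rk * rk, -1)
            D = np.vstack([Qh, Kron]).T               # data matrix, one row per snapshot
            # Tikhonov-regularized least squares: min ||D O^T - Qdh^T||^2 + reg ||O||^2
            O = np.linalg.solve(D.T @ D + reg * np.eye(D.shape[1]), D.T @ Qdh.T).T
            models.append((Vk, O[:, :rk], O[:, rk:]))
        return centroids, models

    def online_step(q, dt, centroids, models):
        """Advance the state one step using the nearest local reduced model."""
        k = np.argmin(np.linalg.norm(centroids - q, axis=1))  # adaptive basis selection
        Vk, Ak, Hk = models[k]
        qh = Vk.T @ q                                          # project onto local basis
        qh = qh + dt * (Ak @ qh + Hk @ np.kron(qh, qh))        # explicit Euler update
        return Vk @ qh                                         # lift back to full space

In this sketch the online selection uses the distance to the cluster centroids of the full-order snapshots; other indicators (for example, distances measured in a reduced coordinate) could equally be used to choose the local basis adaptively.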
