We develop fast and scalable methods for computing reduced-order nonlinear solutions (RONS). RONS was recently proposed as a framework for reduced-order modeling of time-dependent partial differential equations (PDEs), where the modes depend nonlinearly on a set of time-varying parameters. RONS evolves these parameters through a set of ordinary differential equations (ODEs), optimally adapting the shape of the modes to the PDE's solution. This method has already proven extremely effective in tackling challenging problems such as advection-dominated flows and high-dimensional PDEs. However, as the number of parameters grows, forming and integrating the RONS equations become computationally prohibitive. Here, we develop three separate methods to address these computational bottlenecks: symbolic RONS, collocation RONS, and regularized RONS. We demonstrate the efficacy of these methods on two examples: the Fokker–Planck equation in high dimensions and the Kuramoto–Sivashinsky equation. In both cases, the proposed methods yield improvements of several orders of magnitude in speed and accuracy. Our proposed methods also extend the applicability of RONS beyond reduced-order modeling by making it possible to use RONS for the accurate numerical solution of linear and nonlinear PDEs. Finally, as a special case of RONS, we discuss its application to problems where the PDE's solution is approximated by a neural network, with the time-dependent parameters being the weights and biases of the network. The RONS equations dictate the optimal evolution of the network's parameters without requiring any training.
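To make the idea concrete, the following is a minimal sketch (not the authors' code) of a RONS-style parameter evolution for a single Gaussian mode applied to the 1D heat equation. The ansatz family, grid, and tolerances are illustrative assumptions; the parameter ODE has the generic Galerkin form M(θ) θ̇ = f(θ), where M is the metric of the parameter-space tangent vectors and f projects the PDE's right-hand side onto them.

```python
import numpy as np

# Illustrative sketch: evolve a Gaussian ansatz
#   u(x; theta) = A * exp(-(x - c)**2 / (2 * s**2)),  theta = (A, c, s),
# for the heat equation u_t = u_xx via the parameter ODE
#   M(theta) theta_dot = f(theta),
# with M_ij = <du/dtheta_i, du/dtheta_j> and f_i = <du/dtheta_i, u_xx>,
# L2 inner products approximated by a sum on a uniform grid.

x = np.linspace(-20.0, 20.0, 2001)
dx = x[1] - x[0]

def ansatz_and_grads(theta):
    A, c, s = theta
    g = np.exp(-(x - c) ** 2 / (2 * s ** 2))
    u = A * g
    dA = g                              # du/dA
    dc = A * g * (x - c) / s ** 2       # du/dc
    ds = A * g * (x - c) ** 2 / s ** 3  # du/ds
    return u, np.stack([dA, dc, ds])

def theta_dot(theta):
    u, J = ansatz_and_grads(theta)
    u_xx = np.gradient(np.gradient(u, x), x)  # F(u) for the heat equation
    M = (J @ J.T) * dx                        # metric tensor
    f = (J @ u_xx) * dx                       # projected right-hand side
    return np.linalg.solve(M, f)

# Forward-Euler integration; the exact heat-kernel solution predicts
# s(t)**2 = s0**2 + 2*t with the "mass" A * s conserved.
theta = np.array([1.0, 0.0, 1.0])
dt, T = 1e-3, 1.0
for _ in range(int(T / dt)):
    theta = theta + dt * theta_dot(theta)

print(theta)  # s**2 should be close to 1 + 2*T = 3, and A*s close to 1
```

Because the Gaussian family is invariant under heat-equation dynamics, the parameter ODE reproduces the exact solution up to quadrature and time-stepping error; for richer ansätze (or the symbolic, collocation, and regularized variants discussed in the paper), only the assembly of M and f changes.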