Abstract
In this article, we introduce a methodology for improving the accuracy and efficiency of reduced order models (ROMs) constructed using the least-squares Petrov–Galerkin (LSPG) projection method through the introduction of preconditioning. Unlike prior related work, which focuses on preconditioning the linear systems arising within the ROM numerical solution procedure to improve linear solver performance, our approach applies a preconditioning matrix directly within the minimization problem underlying the LSPG formulation. Preconditioning in this way has the potential to improve ROM accuracy for several reasons. First, preconditioning the LSPG formulation changes the norm defining the residual minimization, which can improve the residual-based stability constant bounding the ROM solution's error. Second, the preconditioner can rescale the components of the residual being minimized so that they are of roughly the same magnitude, which is beneficial when applying the LSPG method to problems with disparate scales (e.g., dimensional equations, multi-physics problems). Importantly, we demonstrate that an "ideal preconditioned" LSPG ROM (a ROM in which the preconditioner is the inverse of the Jacobian of its corresponding full order model) emulates projection of the full order model solution increment onto the reduced basis. This quantity defines a lower bound on the error of a ROM solution for a given reduced basis. By designing preconditioners that approximate the Jacobian inverse, as is common when constructing preconditioners for linear solvers, it is possible to obtain a ROM whose error approaches this lower bound. The proposed approach is evaluated on several mechanical and thermo-mechanical problems implemented within the Albany HPC code and run in the predictive regime, with prediction across material parameter space. We demonstrate numerically that introducing simple Jacobi, Gauss-Seidel, and ILU preconditioners into the proper orthogonal decomposition/LSPG formulation significantly reduces the ROM solution error, the reduced Jacobian condition number, the number of nonlinear iterations required to reach convergence, and the wall time (thereby improving efficiency). Moreover, our numerical results reveal that preconditioning can deliver a robust and accurate solution for test cases in which the unpreconditioned LSPG method fails to converge.
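To make the idea concrete, the following is a minimal sketch (not the Albany implementation) of one Gauss-Newton step for a preconditioned LSPG solve, minimizing the preconditioned residual norm over the reduced coordinates. The callables `residual` and `jacobian`, the reduced basis `Phi`, and the preconditioner `M` are hypothetical placeholders supplied by the user; the reference state is omitted for brevity.

```python
# Illustrative sketch of a preconditioned LSPG Gauss-Newton step:
#   min over xhat of || M r(Phi @ xhat) ||_2
# where M approximates the inverse of the full-order Jacobian J.
import numpy as np

def preconditioned_lspg_step(residual, jacobian, Phi, xhat, M):
    """One Gauss-Newton update for the preconditioned LSPG minimization.

    residual : callable, full-order residual r(x), returns shape (N,)
    jacobian : callable, full-order Jacobian J(x), returns shape (N, N)
    Phi      : reduced basis, shape (N, k)
    xhat     : current reduced coordinates, shape (k,)
    M        : preconditioner approximating J(x)^{-1}, shape (N, N)
    """
    x = Phi @ xhat                       # reconstruct full-order state
    r = residual(x)                      # full-order residual
    J = jacobian(x)                      # full-order Jacobian
    A = M @ (J @ Phi)                    # preconditioned reduced Jacobian, (N, k)
    b = M @ r                            # preconditioned residual, (N,)
    # Least-squares step: minimize || A @ delta + b ||_2 over delta
    delta, *_ = np.linalg.lstsq(A, -b, rcond=None)
    return xhat + delta

# Note: if M were exactly inv(J), then A = Phi, and for an orthonormal basis the
# step reduces to projecting the full-order Newton increment -inv(J) @ r onto the
# reduced basis, i.e., the "ideal preconditioned" LSPG case described above.
```

In practice, M would be a sparse approximation to the Jacobian inverse (e.g., Jacobi, Gauss-Seidel, or ILU, as in the abstract) applied matrix-free rather than formed explicitly.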