Abstract

In many applications throughout science and engineering, model reduction plays an important role, replacing expensive large-scale linear dynamical systems with inexpensive reduced order models that capture key features of the original, full order model. One approach to model reduction finds reduced order models that are locally optimal approximations in the $\mathcal{H}_2$ norm, an approach taken by the Iterative Rational Krylov Algorithm (IRKA), among others. Here we introduce a new approach for $\mathcal{H}_2$-optimal model reduction using the projected nonlinear least squares framework previously introduced in [J. M. Hokanson, SIAM J. Sci. Comput. 39 (2017), pp. A3107--A3128]. At each iteration, we project the $\mathcal{H}_2$ optimization problem onto a finite-dimensional subspace, yielding a weighted least squares rational approximation problem. Subsequent iterations append to this subspace such that the least squares rational approximant asymptotically satisfies the first order necessary conditions of the original $\mathcal{H}_2$ optimization problem. This enables us to build reduced order models with similar error in the $\mathcal{H}_2$ norm but using far fewer evaluations of the expensive, full order model compared to competing methods. Moreover, our new algorithm only requires access to the transfer function of the full order model, unlike IRKA, which requires a state-space representation, or TF-IRKA, which requires both the transfer function and its derivative. Applying the projected nonlinear least squares framework to the $\mathcal{H}_2$-optimal model reduction problem opens new avenues for related model reduction problems.
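For context, a sketch of the standard background definitions referenced above, stated for a stable, single-input single-output full order model $H$ (the paper's own notation and the multi-input, multi-output generalization may differ): the $\mathcal{H}_2$ norm is
\[
  \|H\|_{\mathcal{H}_2}^2 = \frac{1}{2\pi} \int_{-\infty}^{\infty} |H(\mathrm{i}\omega)|^2 \, \mathrm{d}\omega,
\]
and for a reduced order model $H_r(s) = \sum_{k=1}^{r} c_k/(s - \lambda_k)$ with simple poles, the first order necessary conditions for local $\mathcal{H}_2$ optimality (the Meier--Luenberger conditions) are the Hermite interpolation conditions
\[
  H(-\overline{\lambda_k}) = H_r(-\overline{\lambda_k}),
  \qquad
  H'(-\overline{\lambda_k}) = H_r'(-\overline{\lambda_k}),
  \qquad k = 1, \ldots, r.
\]
The derivative interpolation appearing in these conditions is why TF-IRKA requires evaluations of both the transfer function and its derivative.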
