Abstract
In the context of time-domain simulation of integrated circuits, one often encounters large systems of coupled differential-algebraic equations. The simulation costs of these systems can become prohibitively large as the number of components keeps increasing. In an effort to reduce these simulation costs, a twofold approach is presented in this paper. We combine the maximum entropy snapshot sampling method and a nonlinear model order reduction technique with multirate time integration. The obtained model order reduction basis is applied using the Gauß-Newton method with approximated tensors. This reduction framework is then integrated using a coupled-slowest-first multirate integration scheme. The convergence of this combined method is verified numerically. Lastly, it is shown that the new method reduces the computational effort without a significant loss of accuracy.