Accurate simulation of large interconnect networks has become a necessity to address signal-integrity issues in current high-speed very-large-scale-integration designs. To accurately characterize a dispersive system of interconnects at higher frequencies, a full-wave analysis is required. However, conventional circuit simulation of interconnects with full-wave models is extremely expensive in CPU time. Recently published moment-matching techniques provide a generalized approach to approximating the responses of lumped/distributed circuits. However, these techniques are based on the quasi-transverse electromagnetic (quasi-TEM) assumption and have no mechanism to handle full-wave models. In this paper, we present a new method that extends model-reduction techniques to the simulation of full-wave models. Three new results are presented: 1) a generalized method to incorporate modal results from a full-wave analysis into circuit simulators; 2) a new algorithm for moment generation involving full-wave models; and 3) a study of the deviations of quasi-TEM approximations from full-wave models at higher frequencies. The proposed algorithm yields a speedup of one to two orders of magnitude over conventional techniques at comparable accuracy. In addition, the proposed method can be used for mixed simulation involving distributed models with frequency-dependent or frequency-independent RLCG parameters, full-wave interconnect models, and measured subnetworks, along with nonlinear terminations.
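For context on the moment-matching framework the paper builds on, the sketch below illustrates classic single-expansion-point moment generation and Padé approximation for a lumped (quasi-TEM) modified-nodal-analysis system of the form (G + sC)x = b. This is a minimal illustration of the general technique, not the authors' full-wave algorithm; the matrices G and C, the port vectors b and l, and the order q are hypothetical placeholders.

```python
import numpy as np

def transfer_moments(G, C, b, l, num_moments):
    """Moments m_k of H(s) = l^T (G + s C)^{-1} b about s = 0,
    i.e. H(s) = sum_k m_k s^k (classic AWE-style expansion for a
    lumped MNA system -- the quasi-TEM case, not the full-wave one)."""
    r = np.linalg.solve(G, b)            # r_0 = G^{-1} b
    moments = [l @ r]
    for _ in range(1, num_moments):
        r = -np.linalg.solve(G, C @ r)   # r_k = -G^{-1} C r_{k-1}
        moments.append(l @ r)
    return np.array(moments)

def approximate_poles(moments, q):
    """Dominant poles from a [q-1/q] Pade fit to the first 2q moments:
    solve the Hankel system for the denominator coefficients of
    Q(s) = 1 + b_1 s + ... + b_q s^q and return its roots."""
    H = np.array([[moments[i + j] for j in range(q)] for i in range(q)])
    rhs = -moments[q:2 * q]
    coeffs = np.linalg.solve(H, rhs)     # (b_q, b_{q-1}, ..., b_1)
    Q = np.concatenate((coeffs, [1.0]))  # highest power first for np.roots
    return np.roots(Q)

# Hypothetical 3-node RC ladder: order-2 reduced model from 4 moments.
G = np.array([[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 1.0]])
C = np.diag([1e-12, 2e-12, 1e-12])
b = np.array([1.0, 0.0, 0.0])   # excitation at node 1
l = np.array([0.0, 0.0, 1.0])   # output observed at node 3
m = transfer_moments(G, C, b, l, 4)
print("moments:", m, "approximate poles:", approximate_poles(m, 2))
```

The paper's contribution, per the abstract, is a moment-generation algorithm for full-wave interconnect models, i.e. the case this quasi-TEM sketch does not cover.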