Abstract

The integration of ever more components into modern integrated circuits (ICs) has led to very large RLC parasitic networks, consisting of millions of nodes, that must be simulated at numerous time points or frequencies to verify the proper operation of the chip. Model order reduction (MOR) techniques have been employed routinely to substitute the large-scale parasitic model with a lower-order model that exhibits a similar response at the input–output ports. However, established MOR techniques generally result in dense system matrices that render their simulation impractical. To this end, in this article we propose a methodology for the sparsification of the dense circuit matrices resulting from MOR of general RLC circuits, which employs a sequence of algorithms based on the computation of the nearest diagonally dominant matrix and the sparsification of the corresponding graph. In addition, we describe a procedure for synthesizing the sparsified reduced-order model into an RLC circuit with only positive elements. Experimental results indicate that a high sparsity ratio of the reduced system matrices can be achieved with very small loss of accuracy.
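
The sketch below is only an illustration of the two concepts named above, not the paper's algorithms: it uses a naive row-wise bump of the diagonal as a stand-in for computing the nearest diagonally dominant matrix, and simple magnitude thresholding as a stand-in for graph sparsification. Function names and the threshold parameter `tau` are assumptions introduced for the example.

```python
import numpy as np

def project_to_diagonal_dominance(G):
    """Naive projection: enlarge each diagonal entry just enough that
    |g_ii| >= sum_{j != i} |g_ij| holds for every row. (The paper computes
    the *nearest* diagonally dominant matrix; this is only a crude stand-in.)"""
    G = np.asarray(G, dtype=float).copy()
    off_sum = np.sum(np.abs(G), axis=1) - np.abs(np.diag(G))
    deficit = np.maximum(off_sum - np.abs(np.diag(G)), 0.0)
    sign = np.where(np.diag(G) >= 0.0, 1.0, -1.0)
    G[np.diag_indices_from(G)] += sign * deficit
    return G

def threshold_sparsify(G, tau=0.1):
    """Zero out off-diagonal entries smaller than tau times the largest
    off-diagonal magnitude. Dropping off-diagonals only shrinks the row-wise
    off-diagonal sums, so diagonal dominance is preserved."""
    G = np.asarray(G, dtype=float).copy()
    off = ~np.eye(G.shape[0], dtype=bool)
    cutoff = tau * np.max(np.abs(G[off]))
    G[off & (np.abs(G) < cutoff)] = 0.0
    return G

if __name__ == "__main__":
    # A random dense symmetric matrix stands in for a dense reduced-order
    # system matrix produced by MOR.
    rng = np.random.default_rng(0)
    dense = rng.normal(size=(8, 8))
    dense = 0.5 * (dense + dense.T)
    dd = project_to_diagonal_dominance(dense)
    sparse = threshold_sparsify(dd, tau=0.4)
    ratio = 1.0 - np.count_nonzero(sparse) / sparse.size
    print(f"sparsity ratio after thresholding: {ratio:.2f}")
```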
