Abstract
Many prior works have investigated electromigration (EM) in full-chip power grid interconnects, which has become one of the major reliability concerns in nanometer VLSI design. However, most published results were obtained under the assumption of a uniformly distributed temperature and/or residual stress across the interconnects. In this paper, we demonstrate a novel methodology and flow for full-chip EM assessment on the multi-layered power grid networks of a 32nm test chip and investigate the impact of within-die temperature and thermal stress variations on the failure rate. The proposed approach is based on recently developed physics-based EM models and an EM-induced IR-drop degradation criterion that replaces the traditional, overly conservative weakest-segment method. The cross-layout temperature distribution caused by power dissipation in devices and by interconnect Joule heating is characterized and taken into account in the full-chip EM assessment methodology. Simulations performed on the analyzed multi-layered power/ground nets show that the traditional assumption of a uniform average temperature leads to inaccurate predictions of the time-to-failure (TTF). Furthermore, accounting for thermal stress variation results in retarded EM-induced degradation.
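For context, physics-based EM models of the kind referenced in the abstract are commonly built on Korhonen-type stress evolution, in which temperature enters through the Arrhenius atomic diffusivity; the sketch below is illustrative of that class of models and is not necessarily the exact formulation used in the paper.

$$
\frac{\partial \sigma}{\partial t} = \frac{\partial}{\partial x}\!\left[\kappa\!\left(\frac{\partial \sigma}{\partial x} + \frac{e Z \rho\, j}{\Omega}\right)\right],
\qquad
\kappa = \frac{D_a B \Omega}{k_B T},
\qquad
D_a = D_0 \exp\!\left(-\frac{E_a}{k_B T}\right)
$$

Here $\sigma$ is the hydrostatic stress in the metal line, $j$ the current density, and $T$ the local temperature; because $D_a$ depends exponentially on $T$, within-die temperature variation directly modulates how quickly stress builds toward a critical value, which is why a uniform-temperature assumption can misestimate TTF.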