Abstract

A novel methodology for electromigration (EM) failure assessment in the power/ground nets of integrated circuits, based on analysis of IR-drop degradation, has been enhanced by accounting for the non-uniform temperature distribution in interconnects. A temperature gradient along an interconnect tree can affect void nucleation through the divergence of the atomic flux as well as through the development of thermal stress. Compact models for the resistance increase of voided metal have been developed using results of FEM simulations of voiding kinetics in via-metal structures. The simulation approach has been validated against measurements of node voltages in a specially designed test power grid. An increase of the anode-cathode voltage drop above a threshold value was adopted as the failure criterion determining the time-to-failure (TTF) of the grid, and the mean-time-to-failure (MTTF) was obtained assuming random distributions of the critical stress and atomic diffusivity. For the temperature distributions studied, good agreement between simulated and measured TTF distributions has been obtained.
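The MTTF extraction step described above can be illustrated with a minimal Monte Carlo sketch: sample the critical stress and atomic diffusivity from assumed lognormal distributions, compute each sample's TTF as the time for the anode-cathode voltage drop to exceed a threshold, and average. All numerical values, and the simple nucleation-plus-linear-resistance-growth model, are illustrative assumptions and not the paper's compact models.

```python
import numpy as np

# Illustrative Monte Carlo estimate of MTTF from random critical stress and
# atomic diffusivity, using the IR-drop-based failure criterion: failure when
# the extra anode-cathode voltage drop exceeds a threshold.

rng = np.random.default_rng(0)
N = 10_000                                                        # Monte Carlo samples

# Assumed lognormal distributions (hypothetical parameters):
sigma_crit = rng.lognormal(mean=np.log(400e6), sigma=0.1, size=N)  # Pa, critical stress
D = rng.lognormal(mean=np.log(1e-16), sigma=0.3, size=N)           # m^2/s, diffusivity

# Assumed model: nucleation time grows with critical stress and falls with
# diffusivity; after nucleation the line resistance grows linearly with a
# rate proportional to diffusivity. Prefactors are purely illustrative.
t_nuc = 2.5e-19 * sigma_crit / D      # s, void nucleation time
dR_rate = 1e10 * D                    # ohm/s, resistance growth rate
I = 1e-3                              # A, branch current
dV_threshold = 10e-3                  # V, allowed extra anode-cathode drop

# TTF per sample: nucleation time plus time for I * dR to reach the threshold.
ttf = t_nuc + dV_threshold / (I * dR_rate)   # s
mttf_hours = np.mean(ttf) / 3600.0
print(f"MTTF estimate: {mttf_hours:.3g} h")
```

Because TTF depends nonlinearly on the sampled diffusivity, the resulting TTF distribution is skewed, which is why the mean is taken over the full sample set rather than evaluated at the median parameters.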


