Abstract

As chip integration continues to increase and technology scaling forces operating voltages to decrease, modern designs have become more susceptible to supply-voltage noise. However, even with a well-designed power distribution network, modern at-speed test pattern generation techniques do not consider the maximum current the network can deliver. As a result, conventional transition delay fault pattern generation tends to create patterns that cause higher-than-average functional switching, which may lead to timing and/or functional failures during test. In this paper, we propose a flow that incorporates layout information and the locality of switching activity during pattern generation to provide insight into the amount of tolerable switching. This prevents both IR-drop-related hot spots and under-utilization of the chip, since the switching activity can be spread evenly across the design. The results presented in this paper show significant improvement over our previous flow without negatively impacting fault coverage or pattern count.
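To make the idea of layout-aware, region-local switching budgets concrete, the sketch below shows one plausible way a pattern could be screened: the die is divided into a grid of regions, the fraction of toggling cells is computed per region, and the pattern is accepted only if every region stays under a tolerable-switching threshold. This is a minimal illustrative assumption, not the authors' actual flow; names such as Cell, REGION_BUDGET, and pattern_ok are hypothetical.

    # Hypothetical sketch: region-based switching-activity budgeting for a test pattern.
    from collections import defaultdict
    from dataclasses import dataclass

    GRID = 8                 # divide the layout into an 8x8 grid of regions (assumed)
    REGION_BUDGET = 0.25     # assumed tolerable fraction of toggling cells per region

    @dataclass
    class Cell:
        x: float             # placement coordinates from the layout
        y: float
        toggled: bool        # did this cell switch under the candidate pattern?

    def region_of(cell, die_w, die_h):
        # Map a placed cell to its layout grid region.
        col = min(int(cell.x / die_w * GRID), GRID - 1)
        row = min(int(cell.y / die_h * GRID), GRID - 1)
        return row, col

    def pattern_ok(cells, die_w, die_h):
        # Accept a pattern only if every region stays within its switching budget.
        toggles = defaultdict(int)
        totals = defaultdict(int)
        for cell in cells:
            r = region_of(cell, die_w, die_h)
            totals[r] += 1
            if cell.toggled:
                toggles[r] += 1
        return all(toggles[r] / totals[r] <= REGION_BUDGET for r in totals)

Under this kind of check, a pattern whose switching is concentrated in one region would be rejected (avoiding a local IR-drop hot spot), while a pattern with the same total activity spread across the die would pass, which mirrors the goal of distributing switching evenly.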
