Abstract
Power-induced over-testing typically occurs when fully functional chips fail during testing because of excessive IR-drop and/or power-supply noise caused by scan-based at-speed test patterns. Over the years, this problem has been tackled either by improving automatic test pattern generation (ATPG) algorithms or through low-power design-for-test (DFT) techniques that typically require hardware changes. This paper presents a methodology to address excessive voltage droop during at-speed transition delay fault testing. A partition-based low-capture-power adaptive test procedure is proposed in three steps. First, the physical partitions containing local hot spots are identified. Second, partition-based low-capture-power constraints are extracted. Third, the extracted constraints are given to the ATPG engine, which generates at-speed patterns that satisfy the local hot-spot power constraints while minimizing the increase in pattern count. Post-silicon Vmin measurements on several designs illustrate the effectiveness of the proposed solution.
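As a rough illustration of the three-step flow summarized above, the following Python sketch shows how per-partition capture-power limits might be derived and turned into ATPG constraint directives. All names, thresholds, and the constraint-file format (`identify_hot_spot_partitions`, `max_toggle_ratio`, the `set_capture_power_budget` directive) are hypothetical assumptions for illustration; the abstract does not specify the authors' actual tool flow or constraint syntax.

```python
# Hypothetical sketch of a partition-based low-capture-power test flow.
# Names, thresholds, and the ATPG constraint format are illustrative
# assumptions, not the paper's actual implementation.

from dataclasses import dataclass


@dataclass
class Partition:
    name: str
    flops: list[str]      # scan flops physically placed in this partition
    ir_drop_mv: float     # worst-case IR-drop from power analysis (mV)


def identify_hot_spot_partitions(partitions, ir_drop_limit_mv=50.0):
    """Step 1: keep only physical partitions whose IR-drop exceeds the limit."""
    return [p for p in partitions if p.ir_drop_mv > ir_drop_limit_mv]


def extract_capture_power_constraints(hot_spots, max_toggle_ratio=0.15):
    """Step 2: derive a per-partition cap on capture switching activity."""
    return {p.name: {"flops": p.flops, "max_toggle_ratio": max_toggle_ratio}
            for p in hot_spots}


def generate_atpg_constraints(constraints):
    """Step 3: emit constraint directives for the ATPG engine (format assumed)."""
    lines = []
    for name, c in constraints.items():
        lines.append(f"# limit capture toggling in hot-spot partition {name}")
        lines.append(f"set_capture_power_budget -region {name} "
                     f"-max_toggle_ratio {c['max_toggle_ratio']}")
    return "\n".join(lines)


if __name__ == "__main__":
    partitions = [
        Partition("core_alu", ["u_alu/ff0", "u_alu/ff1"], ir_drop_mv=72.0),
        Partition("io_ring", ["u_io/ff0"], ir_drop_mv=31.0),
    ]
    hot = identify_hot_spot_partitions(partitions)
    constraints = extract_capture_power_constraints(hot)
    print(generate_atpg_constraints(constraints))
```

In this sketch only the partitions flagged as hot spots receive a capture-toggle budget, so the ATPG engine retains full freedom elsewhere, which is consistent with the abstract's goal of minimizing the increase in pattern count.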