Abstract

The success of trace-buffer-based techniques in post-silicon validation largely depends on the selection of trace signals. Previous work has focused mainly on improving the state restoration ratio. Although this metric is good at measuring the enhancement of internal-state visibility, it does not reflect how effective the selected trace signals are at localizing post-silicon errors; it therefore increases the time devoted to analyzing debug data and slows down the validation process. This paper proposes selecting trace signals based on their error detection utility. The proposed approach attempts to minimize error detection latency by selecting trace signals based on simulations of the design with randomly injected errors. A methodology is presented for ranking trace signals so that bugs are detected as early as possible. A variant of this ranking is also proposed that accounts for the ability of trace signals to detect the maximum number of bugs with minimum overall detection latency. Experiments on large benchmark circuits show that, compared with techniques that maximize the state restoration ratio, the proposed approach drastically reduces detection latency and detects far more circuit misbehavior.
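The abstract's general idea — rank signals by how quickly they reveal injected errors, plus a variant that trades coverage against total latency — can be illustrated with a minimal sketch. This is not the paper's actual algorithm; the `detection` table, the miss penalty, and the greedy coverage heuristic are all illustrative assumptions standing in for data that would come from simulating the design with randomly injected errors.

```python
# Hypothetical sketch: detection[s][e] is the number of cycles until
# trace signal s deviates from its golden value after error e is
# injected (None if s never detects e). In practice these latencies
# would be collected from random error-injection simulations.
detection = {
    "sig_a": {"e1": 5,    "e2": None, "e3": 12},
    "sig_b": {"e1": 9,    "e2": 3,    "e3": None},
    "sig_c": {"e1": None, "e2": 7,    "e3": 4},
}

def rank_by_latency(detection, miss_penalty=1000):
    """Rank signals by average detection latency, lowest first.
    Errors a signal never detects are charged a large penalty."""
    def score(sig):
        lats = detection[sig].values()
        return sum(miss_penalty if l is None else l for l in lats) / len(lats)
    return sorted(detection, key=score)

def greedy_cover(detection, budget):
    """Variant: greedily pick up to `budget` signals that together
    detect the most errors with the smallest summed latency."""
    chosen, covered = [], {}
    for _ in range(budget):
        best, best_key, best_cov = None, None, None
        for sig in detection:
            if sig in chosen:
                continue
            new = dict(covered)  # error -> best latency seen so far
            for e, l in detection[sig].items():
                if l is not None and (e not in new or l < new[e]):
                    new[e] = l
            # Prefer more errors covered, then lower total latency.
            key = (-len(new), sum(new.values()))
            if best_key is None or key < best_key:
                best, best_key, best_cov = sig, key, new
        chosen.append(best)
        covered = best_cov
    return chosen
```

With a real trace-buffer width as the budget, `greedy_cover` would return the signal subset to route to the buffer, while `rank_by_latency` gives a simple per-signal ordering when coverage interactions are ignored.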
