Abstract
Circuit analysis with respect to aging-induced degradation is critical to ensure correct operation throughout the entire lifetime of a chip. However, state-of-the-art techniques only allow uniformly applied degradation to be considered, even though different workloads lead to different degradation because they induce different switching activities. This makes estimates of the required timing guardbands overly pessimistic, resulting in unnecessary losses of performance and efficiency. In this work, we propose an approach that takes real-world workload dependencies into account and generates workload-specific aging-aware standard cell libraries. This allows for accurate analysis of circuits under the actual effect of aging-induced degradation. We employ machine learning techniques to overcome the infeasible simulation times of individual transistor aging while sustaining high accuracy. In our evaluation on the PULP microprocessor, we predict workload-dependent aging-aware standard cells with an average accuracy (R² score) of 94.7 %. Using the predicted cell libraries in Static Timing Analysis, timing guardbands are reported with an error of less than 0.1 %. We demonstrate that timing guardband requirements can be reduced by up to 21 % by considering specific workloads instead of the worst-case analysis performed in state-of-the-art tool flows.
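To illustrate the kind of evaluation the abstract refers to, the following is a minimal, hypothetical sketch of training a regressor to predict aging-induced cell delay degradation from workload features and scoring it with R², the metric reported above. The feature names (duty cycle, toggle rate, stress time), the random-forest model, and the synthetic delay model are illustrative assumptions only, not the authors' actual flow, in which the ground truth would come from transistor-level aging simulations.

```python
# Hedged sketch: regression of aging-induced delay shift from workload features,
# evaluated with the R^2 score. All feature names and the synthetic "ground truth"
# below are assumptions for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Assumed per-cell workload features:
duty_cycle = rng.uniform(0.0, 1.0, n)     # fraction of time the stress condition is applied
toggle_rate = rng.uniform(0.0, 0.5, n)    # switching activity
stress_time = rng.uniform(0.0, 10.0, n)   # operating time in years

# Synthetic delay degradation, standing in for SPICE-level aging simulation results:
delay_shift = (0.05 * duty_cycle * stress_time**0.2
               + 0.01 * toggle_rate
               + rng.normal(0.0, 0.002, n))

X = np.column_stack([duty_cycle, toggle_rate, stress_time])
X_train, X_test, y_train, y_test = train_test_split(
    X, delay_shift, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# R^2 close to 1 means the model explains most of the workload-dependent variation.
print("R^2 score:", r2_score(y_test, model.predict(X_test)))
```

The predicted delay shifts could then be written back into per-cell timing entries of a standard cell library and consumed by a Static Timing Analysis tool, which is the usage pattern the abstract describes.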