Abstract

With the continuous scaling of CMOS technology, microelectronic circuits are increasingly susceptible to variations such as changes in operating conditions. These variations introduce delay uncertainty into the circuits, which can lead to timing errors. Circuit designers typically combat such errors with conservative guardbands in circuit and architectural design, which, however, incur a significant loss of operational efficiency. In this paper, we propose TEVoT, a supervised learning model that predicts the timing errors of functional units (FUs) under different operating conditions, clock speeds, and input workloads. We perform dynamic timing analysis to characterize the delay variations of FUs under different conditions and collect training data from this characterization. We then extract useful features from the training data and apply supervised learning methods to build TEVoT. Across 100 operating conditions, 4 widely used FUs, 3 clock speeds, and 3 datasets, TEVoT achieves an average prediction accuracy of 98.25% and is 100x faster than gate-level simulation. We further use TEVoT to estimate application output quality under different operating conditions by exposing circuit-level timing errors to the application level. TEVoT achieves an average estimation accuracy of 97% for two image-processing applications across 100 operating conditions.
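
To make the prediction task concrete, the following is a minimal sketch of the kind of supervised timing-error predictor the abstract describes: a classifier trained on per-operation features (operating condition, clock period, operand statistics) labeled by whether the operation's delay exceeds the clock period. The feature set, the random-forest model, and the synthetic data are illustrative assumptions, not the paper's actual characterization pipeline.

# Minimal sketch (assumptions: feature set, model choice, and synthetic labels are illustrative only)
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 5000

# Hypothetical per-operation features: operating condition (supply voltage,
# temperature), clock period, and simple workload statistics of the operands.
X = np.column_stack([
    rng.uniform(0.8, 1.2, n),    # supply voltage (V)
    rng.uniform(25, 125, n),     # temperature (deg C)
    rng.uniform(0.5, 1.5, n),    # clock period (ns)
    rng.integers(0, 2**16, n),   # operand A (toy workload feature)
    rng.integers(0, 2**16, n),   # operand B (toy workload feature)
])

# Synthetic label: 1 if the (toy) path delay exceeds the clock period,
# standing in for the dynamic-timing-analysis labels used in the paper.
delay = 0.9 - 0.4 * (X[:, 0] - 0.8) + 0.002 * (X[:, 1] - 25)
y = (delay > X[:, 2]).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("timing-error prediction accuracy:", accuracy_score(y_test, model.predict(X_test)))

Once such a model is trained per FU and clock speed, predicted error patterns can be injected at the application level to estimate output quality, which is the use case the abstract evaluates on the two image-processing applications.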
