Abstract

SRAM leakage power dominates the total power consumption of low duty-cycle applications such as sensor nodes. Leakage power during data retention in SRAM standby is therefore commonly reduced by lowering the supply voltage. Each SRAM cell has a minimum supply voltage, called the data-retention voltage (DRV), above which the stored bit is retained reliably. The DRV exhibits significant intra-chip variation in deep sub-micron technologies. As the supply voltage is lowered, leakage power decreases, but a larger fraction of SRAM cells becomes prone to retention failures. The use of appropriate error correction to compensate for this loss of cell reliability is proposed. With this approach, the standby supply voltage is selected to minimize the leakage power per useful bit. Fundamental limits on the leakage power per useful bit, accounting for the DRV distribution, are established. Minimizing the power per bit yields a supply voltage at which a small fraction of cells fail to retain their data. For experimental DRV distributions, a [31,26,3] Hamming-code-based implementation achieves a significant portion of the leakage-power reduction promised by the fundamental limit. These analytical results are verified on twenty-four experimental chips manufactured in an industrial 90 nm CMOS process.
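To make the trade-off concrete, the following Python sketch illustrates the kind of optimization the abstract describes: leakage power per useful bit as a function of standby voltage, compared between an information-theoretic limit and a [31,26,3] Hamming-coded array. Everything here is an assumption for illustration, not the paper's actual models: the Gaussian DRV distribution (mean 0.25 V, sigma 0.03 V), the toy exponential leakage model, the BSC-capacity normalization of the limit, and the word-failure accounting for the coded case are all hypothetical.

```python
import numpy as np
from scipy.stats import norm

# --- Illustrative models; all parameters are assumptions, not from the paper ---

def cell_failure_prob(v, drv_mean=0.25, drv_sigma=0.03):
    """P(DRV > v): fraction of cells that lose their bit at standby voltage v,
    assuming a Gaussian intra-chip DRV distribution (hypothetical parameters)."""
    return norm.sf(v, loc=drv_mean, scale=drv_sigma)

def leakage_power_per_cell(v, i0=1e-9, v_char=0.1):
    """Toy leakage model: power rises with v both linearly and through a
    DIBL-like exponential current dependence (hypothetical parameters)."""
    return v * i0 * np.exp(v / v_char)

def bsc_capacity(p):
    """Capacity 1 - h(p) of a binary symmetric channel with crossover p;
    used here to normalize leakage into 'power per useful bit'."""
    p = np.clip(p, 1e-12, 0.5 - 1e-6)  # avoid log(0) and exact-zero capacity
    h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
    return 1.0 - h

def power_per_useful_bit_limit(v):
    """Sketch of a fundamental limit: leakage per cell divided by the
    useful bits per cell that the failure statistics allow."""
    return leakage_power_per_cell(v) / bsc_capacity(cell_failure_prob(v))

def power_per_useful_bit_hamming(v, n=31, k=26):
    """[31,26,3] Hamming code: n cells store k useful bits and can correct
    one retention failure per word; useful capacity is discounted by the
    residual word-failure probability (illustrative accounting)."""
    p = cell_failure_prob(v)
    word_ok = (1 - p)**n + n * p * (1 - p)**(n - 1)  # at most 1 failed cell
    return (n / k) * leakage_power_per_cell(v) / word_ok

if __name__ == "__main__":
    volts = np.linspace(0.20, 0.45, 251)
    lim = power_per_useful_bit_limit(volts)
    ham = power_per_useful_bit_hamming(volts)
    print(f"limit:   min power/bit at V = {volts[np.argmin(lim)]:.3f} V")
    print(f"Hamming: min power/bit at V = {volts[np.argmin(ham)]:.3f} V")
```

With these assumed parameters, both curves attain their minimum at a voltage where a small but nonzero fraction of cells fails, mirroring the abstract's observation that minimizing power per useful bit deliberately operates below the point of perfect retention.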
