Abstract

In static random access memory (SRAM), standby power is the sum of scrubbing power and leakage power. In terrestrial environments, leakage power dominates scrubbing power. Hence, the conventional approach to reducing SRAM standby power is to lower ${V_{DD}}$ to the minimum voltage achievable under process, voltage, and temperature variations. However, in severe radiation environments such as space, a high scrubbing rate is indispensable to prevent the accumulation of soft errors, so scrubbing power constitutes a substantial portion of the total standby power. Since the soft-error rate increases as ${V_{DD}}$ is scaled down, the conventional approach may not be valid under radiation. We present a methodology for determining the supply voltage that minimizes standby power under radiation. We demonstrate the methodology under the solar-max/solar-min galactic cosmic ray environments of geosynchronous earth orbit and three error correction code (ECC) scenarios: Hamming code, double-error-correction (DEC) Bose–Chaudhuri–Hocquenghem (BCH) code, and triple-error-correction (TEC) BCH code. In 65 nm CMOS, the Hamming code fails to meet our target decoded bit-error rate. For the other ECCs, the proposed methodology identifies 0.97 V (DEC BCH) and 0.8 V (TEC BCH) as optimal, yielding 30% (DEC BCH) and 60% (TEC BCH) standby power savings, respectively, compared to the nominal voltage of 1.2 V.
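The trade-off described above can be sketched numerically: leakage power falls as ${V_{DD}}$ is scaled down, while scrubbing power rises because the soft-error rate grows at lower voltage, so the total standby power has an interior minimum. The model forms and constants below are illustrative placeholders chosen only to exhibit this shape; they are not the paper's device data or radiation models.

```python
import math

def leakage_power(vdd):
    # Toy model: leakage power decreases as V_DD is scaled down
    # (hypothetical exponential dependence, normalized to 1.2 V).
    return 1.0 * vdd * math.exp(2.0 * (vdd - 1.2))

def scrubbing_power(vdd):
    # Toy model: the soft-error rate rises at lower V_DD (smaller
    # critical charge), so a higher scrub rate -- and thus more
    # scrubbing power -- is required to keep errors from accumulating.
    return 0.05 * math.exp(3.0 * (1.2 - vdd))

def optimal_vdd(vmin=0.5, vmax=1.2, steps=200):
    # Sweep the supply voltage and pick the minimum-total-power point.
    grid = [vmin + i * (vmax - vmin) / steps for i in range(steps + 1)]
    return min(grid, key=lambda v: leakage_power(v) + scrubbing_power(v))
```

With these placeholder models the optimum lands strictly between the sweep endpoints, mirroring the paper's finding that the best ${V_{DD}}$ under radiation sits above the terrestrial minimum-voltage choice but below the nominal 1.2 V.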
