In this paper, we demonstrate the effects of CMOS technology scaling on the high-temperature characteristics (from 25°C to 125°C) of the four components of off-state drain leakage (I_off): subthreshold leakage (I_sub), gate edge-direct-tunneling leakage (I_EDT), gate-induced drain leakage (I_GIDL), and bulk band-to-band-tunneling leakage (I_B-BTBT). In addition, the high-temperature characteristics of I_off under reverse body bias (V_B), applied for further reduction of standby leakage, are also demonstrated. The discussion is based on data measured from three CMOS logic technologies (i.e., low-voltage, high-performance (LV); low-power (LP); and ultra-low-power (ULP)) across three generations (0.18 μm, 0.15 μm, and 0.13 μm). Experiments show that the optimum V_B, which minimizes I_off, is a function of temperature. The experiments also show that for next-generation CMOS logic technologies, it is important to control I_B-BTBT and I_GIDL by reducing the effective doping concentration and doping gradient. It appears that, to meet on-state gate leakage (I_G-on) and I_EDT specifications while retaining a 10-20% performance improvement, high-quality, high-dielectric-constant materials are indispensable for reducing the effective oxide thickness (EOT). The role of each leakage component in SRAM standby current (I_SB) is also analyzed.