In an integrated circuit (IC) adopting a power-gating (PG) technique, the virtual supply voltage (VVDD) is susceptible to 1) negative-bias temperature instability (NBTI) degradation, which weakens the PG device over time, and 2) temporal temperature variation, which affects the active leakage current (and thus the total current) of the IC. The PG device is sized to guarantee a minimum VVDD level over the chip lifetime; hence, both NBTI degradation and the worst-case total current at high temperature must be considered when sizing it. This results in a higher VVDD, and thus higher active leakage power, than necessary in early chip lifetime and/or at low temperature, which also degrades the gate-oxide reliability of the transistors. To curb this active leakage power increase and improve gate-oxide reliability, we propose two techniques that adjust the strength of the PG device at runtime based on its usage and the IC's temperature. We demonstrate the efficacy of these techniques with an experimental setup using a 32-nm technology model in the presence of within-die spatial process and temperature variations. Averaged over 100 die samples, the proposed techniques reduce dynamic and active leakage power by up to 3.7% and 10%, respectively, in early chip lifetime. They also reduce the oxide failure rate by up to 5% across process corners over a period of seven years.
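As a rough illustration of why worst-case sizing raises early-life VVDD, a first-order linear-drop model of the PG device can be assumed (the symbols $R_{\mathrm{PG}}$ and $I_{\mathrm{total}}$ and the model itself are simplifications introduced here for exposition, not quantities taken from the paper):

\[
V_{\mathrm{VVDD}} \;\approx\; V_{\mathrm{DD}} \;-\; I_{\mathrm{total}}(T)\, R_{\mathrm{PG}}(t),
\]

where the on-resistance $R_{\mathrm{PG}}(t)$ grows with NBTI stress time $t$ and the total current $I_{\mathrm{total}}(T)$ grows with temperature $T$. Sizing the PG device so that $V_{\mathrm{VVDD}}$ stays above its minimum at the end-of-life, high-temperature corner means that in early life and at low temperature both $I_{\mathrm{total}}$ and the NBTI-induced resistance increase are smaller, so the drop $I_{\mathrm{total}} R_{\mathrm{PG}}$ shrinks and $V_{\mathrm{VVDD}}$ sits higher than required, which is the source of the excess active leakage and gate-oxide stress discussed above.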