Voltage droop is a major reliability concern in nanoscale very-large-scale integration (VLSI) designs. Undesirable voltage droop is often a result of excessive IR drop. In addition, Ldi/dt-induced droop occurs when logic gates in the circuit draw high switching currents from the on-chip power supply network; this problem is exacerbated at high clock frequencies and smaller technology nodes. Voltage droop typically increases path delays and can cause intermittent faults during circuit operation. The addition of conservative timing margins, also known as guardbands, is a common practice to tackle the problem of voltage droop. However, such static and pessimistic guardbands, which are calculated at design time based on worst-case conditions, lead to significant performance loss. Dynamic frequency scaling is an alternative approach that adjusts the clock frequency based on the actual voltage droop observed at runtime. For dynamic frequency scaling to be effective, accurate and real-time prediction of voltage droop is essential. We propose a support-vector machine (SVM)-based regression method to predict voltage droop due to pattern-dependent IR drop based on inputs to the chip at runtime. Moreover, we reduce the amount of data needed for accurate prediction by using correlation-based feature selection. Results for several benchmarks from the ITC'99 and International Workshop on Logic and Synthesis (IWLS'05) suites highlight the effectiveness of the proposed method in terms of delay-prediction accuracy. Since real-time droop prediction requires hardware implementation of the predictor, we present the hardware design and synthesis results to demonstrate that the hardware overhead for the SVM predictor is negligible for large circuits.
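To make the correlation-based feature-selection step concrete, the following is a minimal sketch, not the paper's implementation: each training sample is assumed to be a chip input pattern (a bit vector) paired with a measured droop value, and input bits are ranked by the absolute Pearson correlation of each bit with the droop, keeping only the top-k bits as features for the regression model. All names and the toy data are hypothetical.

```python
# Hypothetical sketch of correlation-based feature selection for
# pattern-dependent droop prediction. Each pattern is a bit vector of
# chip inputs; droops[i] is the droop measured for patterns[i].
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    if sx == 0 or sy == 0:
        return 0.0  # a constant feature carries no information
    return cov / (sx * sy)

def select_features(patterns, droops, k):
    """Return indices of the k input bits most correlated with droop."""
    n_feat = len(patterns[0])
    scores = [abs(pearson([p[i] for p in patterns], droops))
              for i in range(n_feat)]
    return sorted(range(n_feat), key=lambda i: scores[i], reverse=True)[:k]

# Toy data: bit 0 drives the droop, bit 1 is noise, bit 2 is constant.
patterns = [(0, 1, 1), (1, 0, 1), (1, 1, 1), (0, 0, 1), (1, 0, 1), (0, 1, 1)]
droops = [0.02, 0.11, 0.10, 0.01, 0.12, 0.03]
print(select_features(patterns, droops, 1))  # → [0]
```

Only the selected bits would then be fed to the SVM regressor, which shrinks both the training-data requirement and the hardware predictor's input width.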