Abstract

To cope with the increasing scale of scientific data and the computational complexity of everyday workloads, more and more cores have been integrated into GPUs (Graphics Processing Units) and their operating frequencies keep rising, which has made GPUs widely used in general-purpose computing to help the CPU accelerate programs. While the GPU offers powerful computing capability, its energy consumption has become particularly prominent and is now one of the major issues hindering GPU development. DVFS (Dynamic Voltage and Frequency Scaling) is an effective solution to this problem. Previous work focuses only on a single component and applies a linear relationship for DVFS, without considering the energy savings of other units in the system at software runtime. We therefore propose CDVFS, an energy-saving model that considers the runtime characteristics of both the GPU and memory, based on a GA-BP (Genetic Algorithm-Back Propagation) neural network, to better exploit the relationship between components for energy saving. First, the model assumes that the functional relation between the runtime characteristics of the GPU and memory and their corresponding appropriate frequencies is nonlinear. Second, we extract five characteristics and use a GA-BP neural network to fit this nonlinear relation. Finally, experiments demonstrate the effectiveness of the approach and the reasonableness of the assumption, and show that CDVFS achieves average energy savings of 17.06% compared with previous work, within an acceptable performance loss.
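The GA-BP fitting described above can be illustrated with a minimal sketch: a genetic algorithm evolves the initial weights of a small feed-forward network mapping runtime characteristics to normalized GPU/memory frequencies, and back-propagation then fine-tunes the best individual. The toy data, layer sizes, and hyperparameters below are illustrative assumptions, not the paper's actual feature set or configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: 5 runtime characteristics -> 2 targets
# (normalized GPU and memory frequencies). Purely synthetic.
X = rng.random((64, 5))
Y = np.stack([X[:, :3].mean(axis=1), X[:, 3:].mean(axis=1)], axis=1)

N_IN, N_HID, N_OUT = 5, 8, 2
N_W = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # flat genome length

def unpack(w):
    # Split a flat genome into the network's weight matrices and biases.
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    return W1, b1, W2, w[i:]

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    H = np.tanh(X @ W1 + b1)        # hidden layer
    return H @ W2 + b2, H           # linear output layer

def mse(w):
    pred, _ = forward(w, X)
    return float(((pred - Y) ** 2).mean())

# --- GA phase: evolve good initial weights ---
pop = rng.normal(0.0, 0.5, (30, N_W))
for gen in range(40):
    fit = np.array([mse(ind) for ind in pop])
    pop = pop[np.argsort(fit)]       # sort by fitness (lower MSE = better)
    elite = pop[:10]
    children = []
    for _ in range(20):
        a, b = elite[rng.integers(10)], elite[rng.integers(10)]
        child = np.where(rng.random(N_W) < 0.5, a, b)           # uniform crossover
        child = child + rng.normal(0, 0.05, N_W) * (rng.random(N_W) < 0.1)  # mutation
        children.append(child)
    pop = np.vstack([elite, children])

w = pop[0].copy()
ga_err = mse(w)

# --- BP phase: gradient descent fine-tuning from the GA seed ---
lr = 0.05
for _ in range(500):
    pred, H = forward(w, X)
    W1, b1, W2, b2 = unpack(w)
    dOut = 2.0 * (pred - Y) / len(X)         # dMSE/dOutput
    gW2, gb2 = H.T @ dOut, dOut.sum(axis=0)
    dH = (dOut @ W2.T) * (1.0 - H ** 2)      # tanh derivative
    gW1, gb1 = X.T @ dH, dH.sum(axis=0)
    W1 = W1 - lr * gW1; b1 = b1 - lr * gb1
    W2 = W2 - lr * gW2; b2 = b2 - lr * gb2
    w = np.concatenate([W1.ravel(), b1, W2.ravel(), b2])

bp_err = mse(w)
print(f"GA error: {ga_err:.4f}, after BP fine-tuning: {bp_err:.4f}")
```

The design choice GA-BP makes is to use the global search of the genetic algorithm to escape poor random initializations, then hand the best genome to back-propagation for fast local refinement of the nonlinear fit.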
