Abstract

Even with powerful hardware for parallel execution, it remains difficult to improve application performance and reduce energy consumption without understanding the performance bottlenecks of parallel programs on GPU architectures. To help programmers gain better insight into the performance and energy-saving bottlenecks of parallel applications on GPU architectures, we propose two models: an execution time prediction model (ETPM) and an energy consumption prediction model (ECPM). ETPM estimates the execution time of massively parallel programs while taking instruction-level and thread-level parallelism into account. It consists of two components: a memory sub-model and a computation sub-model. The memory sub-model estimates the cost of memory instructions by considering the number of active threads and the GPU memory bandwidth; correspondingly, the computation sub-model estimates the cost of computation instructions by considering the number of active threads and the application's arithmetic intensity. We use Ocelot to analyze PTX code and obtain several input parameters for the two sub-models, such as the number of memory transactions and the data size. Based on the two sub-models, the analytical model estimates the cost of each instruction while accounting for instruction-level and thread-level parallelism, and thereby estimates the overall execution time of an application. ECPM estimates the total energy consumption based on the data produced by ETPM. We compare the models' predictions with actual executions on a GTX260 and a Tesla C2050. The results show that the models achieve roughly 90% accuracy on average for the benchmarks we used.
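To make the structure described above concrete, the following is a minimal, hypothetical sketch of how a memory sub-model and a computation sub-model could be combined into an overall time estimate, with a trivial power-times-time energy estimate on top. The function names, parameters (e.g., `mem_bandwidth_gbps`, `arithmetic_intensity`, `overlap_factor`), the combination rule, and all numeric values are illustrative assumptions, not the equations or measurements from the paper; inputs such as transaction counts would come from a PTX analysis tool like Ocelot.

```python
# Hypothetical sketch of an ETPM/ECPM-style analytical cost model.
# All formulas, parameter names, and numbers are illustrative assumptions;
# they are not the sub-model equations defined in the paper.

def memory_cost(num_transactions, bytes_per_transaction,
                active_threads, mem_bandwidth_gbps):
    """Estimate memory time (s) from transaction count, data size,
    active threads, and the GPU memory bandwidth."""
    total_bytes = num_transactions * bytes_per_transaction * active_threads
    return total_bytes / (mem_bandwidth_gbps * 1e9)

def computation_cost(num_compute_insts, active_threads,
                     peak_gops, arithmetic_intensity):
    """Estimate compute time (s); here, higher arithmetic intensity is
    assumed to keep the cores busier, raising effective throughput."""
    effective_gops = peak_gops * min(1.0, arithmetic_intensity)
    total_ops = num_compute_insts * active_threads
    return total_ops / (effective_gops * 1e9)

def estimate_kernel_time(mem_time, comp_time, overlap_factor=0.8):
    """Combine the two sub-model costs. With enough thread-level
    parallelism, memory and compute partially overlap, so the total
    is closer to the larger of the two than to their sum."""
    overlapped = overlap_factor * min(mem_time, comp_time)
    return mem_time + comp_time - overlapped

def estimate_energy(exec_time_s, avg_power_watts):
    """ECPM-style estimate: energy (J) = average power x predicted time."""
    return avg_power_watts * exec_time_s

# Example usage with made-up inputs (in the paper's workflow, the
# instruction and transaction counts would come from Ocelot's PTX analysis).
t_mem = memory_cost(num_transactions=1_000_000, bytes_per_transaction=128,
                    active_threads=1, mem_bandwidth_gbps=110)
t_cmp = computation_cost(num_compute_insts=4_000_000, active_threads=1,
                         peak_gops=700, arithmetic_intensity=0.5)
t_total = estimate_kernel_time(t_mem, t_cmp)
print(f"estimated time:   {t_total:.6f} s")
print(f"estimated energy: {estimate_energy(t_total, avg_power_watts=180):.3f} J")
```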
