Abstract

Non-functional properties of programs running on graphics processing units (GPUs), such as execution time or memory access behavior, can raise safety and security concerns. For example, understanding execution time is critical for embedded and real-time applications. To this end, worst-case execution time (WCET) is an important metric for checking the real-time constraints imposed on embedded applications. For complex execution platforms such as GPUs, WCET analysis poses great challenges due to the complexity of both the GPU architecture and GPU program semantics. GPUs also exhibit specific memory access behavior; observing this behavior may reveal sensitive information (e.g., a secret key), which in turn may be exploited to launch a side-channel attack on the underlying program. In this paper, we propose GDivAn, a measurement-based analysis framework for investigating the non-functional aspects of GPU programs, specifically their execution time and side-channel leakage capacity. GDivAn is built upon a novel instantiation of a genetic algorithm (GA). Moreover, GDivAn improves the effectiveness of the GA using symbolic execution where possible. Our evaluation on several open-source GPU kernels, including kernels from OpenSSL and the MRTC benchmark suite, demonstrates the effectiveness of GDivAn both in finding WCET and in detecting side-channel leakage.
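To illustrate the core idea of measurement-based WCET search with a genetic algorithm, the sketch below evolves kernel inputs toward longer observed execution times. This is only a minimal illustration, not the paper's GDivAn instantiation: the "kernel" here is a stand-in cost function (`kernel_cost`), and all names, parameters, and rates are hypothetical.

```python
import random

def kernel_cost(inputs):
    # Stand-in for timing a GPU kernel on the given inputs: cost is
    # highest when many input elements trigger the "slow" branch.
    return sum(1000 if x > 200 else x for x in inputs)

def evolve(pop_size=20, genome_len=8, generations=50, seed=0):
    # Hypothetical GA loop: each genome is a candidate input vector,
    # fitness is the measured (here: simulated) execution time.
    rng = random.Random(seed)
    pop = [[rng.randrange(256) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the half of the population with the longest
        # observed execution time.
        pop.sort(key=kernel_cost, reverse=True)
        parents = pop[:pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, genome_len)      # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                  # point mutation
                child[rng.randrange(genome_len)] = rng.randrange(256)
            children.append(child)
        pop = parents + children
    return max(pop, key=kernel_cost)

worst = evolve()
print(kernel_cost(worst))  # observed worst-case estimate after the search
```

Because the best parents survive each generation, the best observed cost never decreases; the search yields a lower bound on the true WCET, which is the characteristic trade-off of measurement-based analysis mentioned in the abstract.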
