Cryptographic devices are vulnerable to side-channel attacks, which have attracted broad attention in the hardware security field. The side-channel analysis framework proposed at Eurocrypt 2009 has been widely adopted in academia; it comprises key elements such as points of interest, leakage models, distinguishers, and security metrics. This classical framework, however, is not intuitively compatible with recent micro-architectural attacks such as cache attacks, and consequently few attempts have been made to link the two. In this paper, we extend the classical side-channel analysis framework and migrate its ideas to access-driven cache attacks. We find that cache attacks can be effectively incorporated into this framework. Specifically, we propose a leakage model, the cache access pattern vector, derived from the characteristics of cache timing leakage. Furthermore, we propose data preprocessing schemes such as noise reduction, dimensionality reduction, and vector expansion to enable efficient attacks. Subsequently, we propose seven distinguishers in three categories: difference of means-based analysis, vector distance-based analysis, and mutual information-based analysis. Moreover, we attack the last round of a fast software implementation of AES on the Chipyard platform. According to the experimental results, all proposed methods can successfully extract the key, and five of them are comparable in attack efficiency to mainstream cache analyses. Finally, we evaluate the five efficient distinguishers under three different cache micro-architectures using guessing entropy. In the experimental environment of this paper, the vector distance-based distinguishers achieve the best attack efficiency, while the difference of means-based analysis performs worst. Surprisingly, the most general mutual information-based attacks achieve moderately good results within a few attack rounds without the help of static analysis tools. Meanwhile, the experimental results demonstrate that the cache micro-architecture directly affects attack effectiveness: a smaller cache line size benefits the attacks, whereas a random replacement policy increases noise and attack difficulty.
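To make the summarized attack flow concrete, the following is a minimal, hypothetical Python sketch of one of the three distinguisher categories, a vector distance-based distinguisher operating on cache access pattern vectors. The vector layout, the predicted_line mapping, and the parameters LINES and ENTRIES_PER_LINE are illustrative assumptions, not definitions taken from the paper, which targets the inverse S-box lookup of the AES last round.

```python
# Hypothetical sketch of a vector distance-based distinguisher over
# "cache access pattern vectors" (assumed here to be per-cache-line
# access-frequency vectors). All names and parameters are illustrative.
import numpy as np

LINES = 64            # assumed number of monitored cache lines
ENTRIES_PER_LINE = 4  # assumed table entries sharing one cache line

def predicted_line(ct_byte: int, key_guess: int) -> int:
    # A real AES last-round attack would use
    # inv_sbox[ct_byte ^ key_guess] // ENTRIES_PER_LINE; the raw index
    # is used here as a stand-in to keep the sketch self-contained.
    return ((ct_byte ^ key_guess) // ENTRIES_PER_LINE) % LINES

def vector_distance_score(patterns: np.ndarray, ct: np.ndarray, key_guess: int) -> float:
    """patterns: (n_traces, LINES) cache access pattern vectors.
    ct: (n_traces,) ciphertext bytes. Larger score => better guess."""
    groups = [patterns[[predicted_line(int(c), key_guess) == g for c in ct]]
              for g in range(LINES)]
    means = [g.mean(axis=0) for g in groups if len(g) > 0]
    # Sum of pairwise Euclidean distances between group means: only the
    # correct key induces a meaningful partition, so its means spread out.
    return sum(np.linalg.norm(a - b)
               for i, a in enumerate(means) for b in means[i + 1:])

def recover_key_byte(patterns: np.ndarray, ct: np.ndarray) -> int:
    # Rank all 256 key-byte candidates and return the best-scoring one.
    return max(range(256), key=lambda k: vector_distance_score(patterns, ct, k))
```

In this kind of scheme, guessing entropy would then be measured by ranking all candidates with the same score and averaging the rank of the correct key byte over repeated experiments.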