Abstract

One of the main security requirements for symmetric-key block ciphers is resistance against differential cryptanalysis. This is commonly assessed by counting the number of active substitution boxes (S-boxes) using search algorithms or mathematical solvers that incur high computational costs. These costs increase exponentially with respect to block cipher size and number of rounds, quickly becoming prohibitive. Conventional S-box enumeration methods also require niche cryptographic knowledge to perform. In this paper, we overcome these problems by proposing a data-driven approach using deep neural networks to predict the number of active S-boxes. Our approach trades off exactness for real-time efficiency, as the bulk of the computational work is shifted to pre-processing (training). Active S-box prediction is framed as a regression task whereby neural networks are trained using features such as input and output differences, number of rounds, and permutation pattern. We first investigate the feasibility of the proposed approach by applying it to a reduced (4-branch) generalized Feistel structure (GFS) cipher. Apart from optimizing a neural network architecture for the task, we also explore the impact of each feature and its representation on prediction error. We then extend the idea to 64-bit GFS ciphers by first training neural networks using data from five different ciphers before using them to predict the number of active S-boxes for TWINE, a lightweight block cipher. The best performing model achieved the lowest root mean square error of 1.62 and R² of 0.87, demonstrating the feasibility of the proposed approach.

Highlights

  • Block ciphers are ubiquitous cryptographic primitives that provide data confidentiality and are used as building blocks for myriad other cryptographic algorithms and protocols

  • We train deep neural network models to predict the number of active substitution boxes (S-boxes) for lightweight block ciphers based on general block cipher features and differential data

  • Our results show that the permutation pattern has the biggest impact on prediction performance as its removal leads to the largest increase in prediction errors (88.8%), followed by input differences (23%) and output differences (15.8%)

Introduction

Block ciphers are ubiquitous cryptographic primitives that provide data confidentiality and are used as building blocks for myriad other cryptographic algorithms and protocols. A block cipher processes a fixed-length block of data using a key-dependent transformation that usually consists of generic operations such as substitution and permutation. Keys for each encryption round, known as round keys, are derived from a master key using a key scheduling algorithm. Depending on their underlying structure, modern block ciphers can be classified into different categories such as the generalized Feistel structure (GFS), addition-XOR-rotate (ARX), and substitution-permutation network (SPN). Cryptanalysts have attempted to use machine learning in a straightforward manner: performing decryption of ciphertexts without knowledge of the secret key. This is equivalent to training a machine learning model to emulate or mimic an encryption algorithm for a fixed secret key. For this purpose, [18] utilized unsupervised learning with neural networks to attack classical ciphers such as the Vigenere and Shift ciphers.
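The regression framing described in the abstract can be sketched as follows. This is a minimal, purely illustrative example: the data is synthetic (random binary feature vectors and a toy noisy target standing in for the active S-box count), not the paper's GFS differential data, and the tiny NumPy MLP merely stands in for the deep networks the authors actually train.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the paper's training data: in the actual work, features
# encode input/output differences, number of rounds, and permutation pattern;
# here we use random binary vectors for illustration only.
n_samples, n_features = 512, 20
X = rng.integers(0, 2, size=(n_samples, n_features)).astype(float)
# Hypothetical target standing in for the active S-box count: a noisy
# function of the features, so the regression has something to learn.
y = X.sum(axis=1) + rng.normal(0.0, 0.5, n_samples)

# Standardize the target for stable gradient descent.
y_mean, y_std = y.mean(), y.std()
yn = (y - y_mean) / y_std

# Minimal two-layer MLP regressor trained with full-batch gradient descent.
hidden, lr = 32, 0.05
W1 = rng.normal(0, 0.1, (n_features, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(0, 0.1, (hidden, 1)); b2 = np.zeros(1)
for _ in range(2000):
    h = np.maximum(0.0, X @ W1 + b1)                  # ReLU hidden layer
    pred = (h @ W2 + b2).ravel()
    g_pred = 2.0 * (pred - yn)[:, None] / n_samples   # d(MSE)/d(pred)
    gW2 = h.T @ g_pred; gb2 = g_pred.sum(0)
    g_h = (g_pred @ W2.T) * (h > 0)                   # backprop through ReLU
    gW1 = X.T @ g_h; gb1 = g_h.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Evaluate with the paper's metrics, RMSE and R^2, in original units.
pred = (np.maximum(0.0, X @ W1 + b1) @ W2 + b2).ravel() * y_std + y_mean
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
r2 = float(1.0 - np.sum((pred - y) ** 2) / np.sum((y - y.mean()) ** 2))
print(f"RMSE = {rmse:.2f}, R^2 = {r2:.2f}")
```

The point of the sketch is the overall shape of the approach: all expensive work (training) happens once in pre-processing, after which a prediction for a new feature vector is a single cheap forward pass, in contrast to solver-based active S-box counting whose cost grows with cipher size and rounds.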
