Public-key cryptography is now ubiquitous in our lives, protecting everyday interactions ranging from mobile and wireless communications to banking transactions, over-the-air software updates, electronic voting and digital car keys. However, almost all currently deployed public-key cryptography relies on the hardness of integer factorisation or of computing discrete logarithms, and both problems are known to be easy to solve on large-scale quantum computers via Shor's algorithm. Whether such large-scale quantum computers are years away, decades away or more remains to be seen, but since they would break the security of so many critical systems essentially overnight, preparing the transition from classical cryptography to so-called post-quantum primitives, which are believed to be secure even against quantum computers, is seen as necessary and urgent—particularly as updating existing systems is likely to be a lengthy process, and some systems rely on the security of cryptographic keys over long periods of time.

In view of these challenges, the US National Institute of Standards and Technology (NIST) has, since 2016, been running a standardisation effort to produce post-quantum encryption and signature schemes ready for deployment. The first four selected primitives were announced in July 2022, with other schemes undergoing further analysis. Concurrently, other countries, such as the Republic of Korea, have launched similar standardisation processes.

One of the difficulties involved in running such standardisation processes and selecting primitives is security estimation: in order to set parameters for candidate schemes and to make apples-to-apples comparisons between them, consistent security levels need to be defined, and evidence that the schemes achieve those levels needs to be provided. This is typically done by estimating the cost of the best attacks (both classical and quantum) against the proposed constructions. Cryptanalysis is thus an essential tool for design and standardisation. Not only does it allow security estimates to be provided and progressively refined, it also sometimes eliminates entire schemes outright by uncovering serious security flaws. Both applications of cryptanalysis have been extensively represented in the NIST standardisation process in particular.

This special issue gathers four surveys covering cryptanalytic progress on the main families of post-quantum assumptions: 'Quantum Algorithms for Attacking Hardness Assumptions in Classical and Post-Quantum Cryptography', by J.-F. Biasse, X. Bonnetain, E. Martin, E. Kirshanova, A. Schrottenloher and F. Song, on quantum cryptanalysis; 'Recent Progress in the Security Evaluation of Multivariate Public-Key Cryptography', by Y. Ikematsu, S. Nakamura and T. Takagi, on multivariate cryptography; 'Lattice-Based Cryptosystems in Standardization Processes: A Survey', by A. Wang, D. Xiao and Y. Yu, on lattice-based cryptography; and 'Torsion Point Attacks on "SIDH-like" Cryptosystems', by P. Kutas and C. Petit, on isogeny-based cryptography.

Since some post-quantum proposals rely on relatively new assumptions that have only received substantial scrutiny as part of the standardisation processes themselves, their cryptanalysis is a fast-changing landscape. Very recent, dramatic developments include W. Beullens' cryptanalysis of the Rainbow multivariate signature scheme, and W. Castryck and T. Decru's cryptanalysis of the SIKE isogeny-based KEM (along with several follow-ups). Both Rainbow and SIKE were serious contenders for eventual standardisation.
Those surprising results, which appeared too late to be captured in this special issue, are thus further testimony to the utmost importance of cryptanalytic work as part of standardisation efforts. We therefore hope that the contributions included in this special issue will be of great value to the community, insofar as they record, and present in an accessible way, important results towards the goal of obtaining secure and dependable standards for post-quantum cryptography.

Data sharing is not applicable to this article, as no data sets were generated or analysed during the current study.

Ayoub Otmani obtained a Ph.D. in Mathematics and its Applications from the University of Limoges (France) in 2002. He is currently Professor in Computer Science at the University of Rouen Normandie. His research lies in coding theory and cryptography.

Christophe Petit obtained a Ph.D. in Cryptography from the Université catholique de Louvain. He is now Associate Professor at the University of Birmingham and the Université libre de Bruxelles. His research focuses on cryptanalysis and mathematical aspects of cryptography, particularly isogeny-based cryptography.

An alumnus of ENS (Paris, France), Mehdi Tibouchi obtained his Ph.D. in computer science from Université Paris VII and the University of Luxembourg in 2011. He is now a distinguished researcher at NTT Corporation (Tokyo, Japan) and a guest associate professor at Kyoto University (Kyoto, Japan). His research interests cover various mathematical aspects of public-key cryptography and cryptanalysis, particularly related to elliptic curves and Euclidean lattices.