- Research Article
- 10.3390/cryptography10020027
- Apr 20, 2026
- Cryptography
- Maksim Iavich + 2 more
Polynomial commitment schemes (PCS) enable a prover to commit to a polynomial and later reveal evaluations with succinct, verifiable proofs. As critical components of modern cryptographic systems such as Verkle trees and zk-SNARKs, these schemes are undergoing a significant transition from classical to post-quantum designs. To examine this progression, this research systematically compares the major scheme families, from pairing-based KZG and transparent Bulletproofs to lattice-based and hash-based post-quantum alternatives. After conducting a PRISMA-guided systematic review of 77 works, we present a unified taxonomy that maps the classical-to-post-quantum transition across trust models, security assumptions, and efficiency measures. Our analysis reveals a fundamental trade-off between efficiency and security: classical schemes, which rely on quantum-vulnerable assumptions, provide optimal performance with constant-sized proofs, while post-quantum alternatives offer quantum resistance at the cost of larger proofs and higher computational overhead. By synthesizing these works, we highlight recurrent problems with adaptive security, verification efficiency, and proof succinctness. Based on our quantitative analysis, we offer a concrete research roadmap with prioritized short-, medium-, and long-term directions for closing the performance gap between quantum-resistant and classical constructions. This study thus serves as both a technical reference and a strategic roadmap for constructing practical post-quantum polynomial commitments.
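As a concrete instance of the commit-then-open mechanism this abstract describes, here is a minimal sketch of the standard pairing-based KZG relations (textbook notation, not taken from the surveyed paper itself), which shows where the constant-sized proofs come from:

```latex
% KZG sketch: a trusted setup publishes g^{\tau^i}; the prover commits to
% \phi(X) and proves the evaluation \phi(z) = y with one group element.
C = g^{\phi(\tau)}, \qquad
q(X) = \frac{\phi(X) - y}{X - z}, \qquad
\pi = g^{q(\tau)}
% A single pairing check suffices, hence the constant-sized proofs:
e\!\left(C \cdot g^{-y},\, g\right) \;=\; e\!\left(\pi,\, g^{\tau} \cdot g^{-z}\right)
```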
- Research Article
- 10.3390/cryptography10020026
- Apr 14, 2026
- Cryptography
- Alessandro Caniglia + 4 more
Classical approaches to cryptography exhibit several limitations when applied to scenarios involving more than two users. The One-Time User Key (OTUK) meta-cryptographic model addresses these limitations by enabling multi-user encryption that is flexible, applicable to any cryptographic algorithm, and designed for systematic deployment without compromising system security. Each user possesses an individual key from which one-time keys are derived; these keys feed a secret-sharing function (ω) that establishes the multi-user encrypted channel. In this paper, we present a polynomial-based implementation of the ω function under a (1, n) threshold model. The generated polynomial has roots at points corresponding to valid user keys and is mapped to the real encryption key. We provide a formal threat model, pseudocode for the complete protocol, and a detailed computational analysis across the numerical domains ℕ, ℤ, and ℝ. Furthermore, we present experimental benchmarks measuring encryption/decryption speed, scalability up to 30 users, parameter sensitivity, and a comparative evaluation against Shamir's Secret Sharing scheme. A systematic security analysis examines partial-information attacks, derivative-root distance margins, and brute-force resistance, demonstrating that the effective security margin remains above 245 bits for configurations of up to 30 users with 256-bit keys. The proposed method offers a concrete, efficient, and secure foundation for multi-user encrypted communication in domains such as IoT, public administration, and e-health.
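A hypothetical Python sketch of the (1, n)-threshold idea the abstract describes: a polynomial with a root at every valid one-time user key, mapped down to the channel key. Function names (build_omega, derive_channel_key) and the hash-based key derivation are illustrative assumptions, not the paper's exact construction:

```python
# Illustrative sketch only: omega(x) = prod_i (x - k_i) vanishes at every
# valid one-time key, so any single key (1-of-n) certifies membership.
import hashlib
import secrets

def build_omega(user_keys):
    """Coefficients (lowest degree first) of omega(x) = prod_i (x - k_i)."""
    coeffs = [1]
    for key in user_keys:
        nxt = [0] * (len(coeffs) + 1)
        for j, c in enumerate(coeffs):
            nxt[j + 1] += c        # contribution of x * (c x^j)
            nxt[j] -= key * c      # contribution of -k * (c x^j)
        coeffs = nxt
    return coeffs

def eval_omega(coeffs, point):
    acc = 0
    for c in reversed(coeffs):     # Horner's rule
        acc = acc * point + c
    return acc

def derive_channel_key(coeffs):
    """Map the polynomial to a 256-bit encryption key (illustrative step)."""
    return hashlib.sha256(",".join(map(str, coeffs)).encode()).digest()

user_keys = [secrets.randbits(256) for _ in range(3)]      # 3 users, 256-bit keys
omega = build_omega(user_keys)
channel_key = derive_channel_key(omega)
assert all(eval_omega(omega, k) == 0 for k in user_keys)   # any one key is a root
```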
- Research Article
- 10.3390/cryptography10020021
- Mar 24, 2026
- Cryptography
- Yuqun Lin + 7 more
Fully homomorphic encryption (FHE) offers a promising solution for privacy-preserving machine learning by enabling arbitrary computations on encrypted data. However, the efficient evaluation of non-linear functions—such as the ReLU activation function over large integers—remains a major obstacle in practical deployments, primarily due to high bootstrapping overhead and limited precision support in existing schemes. In this paper, we propose LargeIntReLU, a novel framework that enables efficient homomorphic ReLU evaluation over large integers (7–11 bits) via full-domain bootstrapping. Central to our approach is a signed digit decomposition algorithm, SignedDecomp, that partitions a large integer ciphertext into signed 6-bit segments using three new low-level primitives: LeftShift, HomMod, and CipherClean. This decomposition preserves arithmetic consistency, avoids cross-segment carry propagation, and allows parallelized bootstrapping. By segmenting the large integer and processing each chunk independently with optimized small-integer bootstrapping, we achieve homomorphic ReLU with full-domain bootstrapping, which significantly reduces the total number of sequential bootstrapping operations required. The security of our scheme is guaranteed by TFHE. Experimental results demonstrate that the proposed method reduces the bootstrapping cost by an average of 28.58% compared to state-of-the-art approaches while maintaining 95.2% accuracy. With execution times ranging from 1.16 s to 1.62 s across 7–11 bit integers, our work bridges a critical gap toward a scalable and efficient homomorphic ReLU function, which is useful in privacy-preserving machine learning. Furthermore, an end-to-end encrypted inference test on a CNN model with the MNIST dataset confirms its practicality, achieving 88.85% accuracy and demonstrating a complete pipeline for privacy-preserving neural network evaluation.
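To illustrate the decomposition at the heart of SignedDecomp, here is a plaintext-only Python sketch of splitting a signed value into signed 6-bit digits in [-32, 31], so that each segment could be bootstrapped independently with no cross-segment carries. The paper's version operates on TFHE ciphertexts via LeftShift, HomMod, and CipherClean, none of which is reproduced here:

```python
# Plaintext illustration of signed digit decomposition; the homomorphic
# analogue runs each segment through small-integer bootstrapping in parallel.
def signed_decomp(v, base_bits=6, n_digits=2):
    base, half = 1 << base_bits, 1 << (base_bits - 1)
    digits = []
    for _ in range(n_digits):
        d = v % base
        if d >= half:                 # re-center the digit into [-32, 31]
            d -= base
        digits.append(d)              # least significant digit first
        v = (v - d) // base           # exact division: v - d ≡ 0 (mod base)
    assert v == 0, "input outside the representable signed range"
    return digits

def recompose(digits, base_bits=6):
    return sum(d << (i * base_bits) for i, d in enumerate(digits))

def relu_via_digits(v):
    # The sign of v equals the sign of its most significant nonzero digit,
    # so a per-segment test (a bootstrap, in the FHE setting) selects v or 0.
    top = next((d for d in reversed(signed_decomp(v)) if d != 0), 0)
    return v if top >= 0 else 0

assert recompose(signed_decomp(837)) == 837       # 11-bit signed example
assert relu_via_digits(-700) == 0 and relu_via_digits(837) == 837
```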
- Research Article
- 10.3390/cryptography10020015
- Feb 26, 2026
- Cryptography
- Mohammad Alkhatib
Deepfake technology can produce highly realistic manipulated media that pose significant cybersecurity threats, including fraud, misinformation, and privacy violations. This research proposes a deepfake prevention approach based on symmetric and asymmetric ciphers. Post-quantum asymmetric ciphers are utilized for digital signature operations, which offer essential security services, including integrity, authentication, and non-repudiation, while symmetric ciphers provide confidentiality and authentication. Unlike classical ciphers that are vulnerable to quantum attacks, this study adopts quantum-resilient ciphers to offer long-term security. The proposed approach enables entities to digitally sign media content before its public release on other platforms; end users can subsequently verify the authenticity of the content using the media creators' public keys. To identify the most efficient ciphers for the cryptographic operations required for deepfake prevention, the study explores implementations of quantum-resilient symmetric and asymmetric ciphers standardized by NIST, including Dilithium, Falcon, SPHINCS+, and Ascon-80pq, and provides comprehensive comparisons between classical and post-quantum ciphers in both categories. Experimental results revealed that the Dilithium-5 and Falcon-512 algorithms outperform other post-quantum ciphers, with time delays of 2.50 ms and 251 ms, respectively, for digital signature operations. Falcon-512 also demonstrates superior resource efficiency, making it a cost-effective choice for digital signatures. Among symmetric ciphers, Ascon-80pq achieved the lowest time consumption, taking just 0.015 ms for encryption and decryption, and it is a strong option for constrained devices, since it consumes fewer resources than standard symmetric ciphers such as AES. Through these comprehensive evaluations and comparisons, this study serves as a blueprint for identifying the most efficient ciphers for the cryptographic operations necessary for deepfake prevention.
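A hedged sketch of the sign-before-release / verify-on-view flow, assuming the liboqs-python bindings with a build that exposes the "Dilithium5" mechanism (newer releases name it "ML-DSA-87"); the file name and hash-then-sign step are illustrative choices, not the paper's exact pipeline:

```python
# Illustrative only: creator signs a media digest before publishing;
# any end user verifies it against the creator's published public key.
import hashlib
import oqs

media = open("clip.mp4", "rb").read()          # media about to be published
digest = hashlib.sha3_256(media).digest()      # hash-then-sign keeps input small

# Creator side: generate a keypair and sign before public release.
creator = oqs.Signature("Dilithium5")
public_key = creator.generate_keypair()        # secret key stays inside `creator`
signature = creator.sign(digest)

# Consumer side: verify authenticity with the creator's public key.
verifier = oqs.Signature("Dilithium5")
assert verifier.verify(digest, signature, public_key)
```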
- Research Article
- 10.3390/cryptography10010010
- Feb 12, 2026
- Cryptography
- Chuanming Zong
In 1994, P. Shor discovered quantum algorithms that can break both the RSA and ElGamal cryptosystems. In 2007, D-Wave demonstrated the first quantum computer. These events and subsequent developments have brought a crisis to secret communication. In 2016, the National Institute of Standards and Technology (NIST) launched a global project to solicit and select a handful of encryption algorithms able to resist quantum computer attacks. In 2022, it announced four candidates for post-quantum cryptography standards: CRYSTALS-Kyber, CRYSTALS-Dilithium, Falcon, and SPHINCS+. The first three are based on lattice theory and the last on hash functions. The security of lattice-based cryptosystems relies on the computational complexity of the shortest vector problem (SVP), the closest vector problem (CVP), and their generalizations. As we will explain, the SVP is a ball-packing problem and the CVP is a ball-covering problem; furthermore, both are equivalent to arithmetic problems for positive definite quadratic forms. This paper briefly describes the mathematical problems on which lattice-based cryptography is built, so that cryptographers can broaden their view and learn something useful.
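To make the packing/covering analogy and the quadratic-form equivalence concrete, here are the standard definitions (textbook statements, not specific to this paper):

```latex
% Lattice generated by a basis B = (b_1, \dots, b_k):
\Lambda = \Bigl\{ \textstyle\sum_{i=1}^{k} z_i b_i \;:\; z_i \in \mathbb{Z} \Bigr\}
% SVP: the length of a shortest nonzero vector (a ball-packing radius):
\lambda_1(\Lambda) = \min_{v \in \Lambda \setminus \{0\}} \lVert v \rVert
% CVP: the lattice point nearest a target t (a ball-covering question):
\mathrm{CVP}(\Lambda, t) = \operatorname*{arg\,min}_{v \in \Lambda} \lVert v - t \rVert
% Quadratic-form view: with Q = B^{\mathsf{T}} B positive definite,
\lVert B x \rVert^{2} = x^{\mathsf{T}} Q x \quad \text{for } x \in \mathbb{Z}^{k}
```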
- Research Article
- 10.3390/cryptography10010012
- Feb 12, 2026
- Cryptography
- Sang-Yoon Chang + 1 more
Post-quantum cryptography (PQC) provides the cryptographic algorithms needed to secure digital networking systems against future adversaries equipped with quantum computing. This paper reviews the PQC research landscape and identifies open challenges and future directions for the critical transition to PQC in digital networking systems. Building on the NIST standardization process, which has hardened the security of the PQC cipher algorithms, this paper analyzes recent research on PQC implementations and their integration into scalable, standardized networking systems (the Internet, the web, and cellular networks). We review research on the security, side-channel threats, performance, overhead, and compatibility of PQC ciphers. We also examine work incorporating PQC into standardized web and cellular networking protocols, ranging from feasibility testing to protocol-level solutions and mechanisms that enable PQC. Our study highlights the challenge of large parameter sizes, common across PQC cipher algorithms, and the research proposing protocol- and system-level mechanisms to address it. Informed by the survey, this paper identifies research gaps and future directions to facilitate further research and development on PQC and to secure next-generation digital networking systems.
- Research Article
- 10.3390/cryptography10010011
- Feb 12, 2026
- Cryptography
- Maya Thabet + 3 more
Post-quantum cryptography (PQC) currently dominates the field of cybersecurity, and rightly so, with many works designing and evaluating the transition of communications security to quantum-safe solutions. As the security levels and implementations of post-quantum algorithms mature, research on their application under realistic conditions evolves accordingly, especially regarding widely adopted network architectures and protocols such as the Public Key Infrastructure (PKI). In this survey, we identify articles presenting ways of integrating PQC algorithms into PKI and classify related work according to the employed methods and benchmarking choices. The main results from many evaluations converge to similar conclusions on the performance of the most popular PQC digital signature algorithms; however, modeling choices concerning architecture variants, hardware, and measurement metrics vary. The diversity of results and experimental setups makes comparison difficult and an objective conclusion on PKI requirements almost impossible to reach. Ultimately, this review reveals a fragmented landscape of benchmarking practices for post-quantum PKI systems: the absence of standardized evaluation frameworks and common test environments limits the comparability and reproducibility of findings. We aim to provide reference implementations, which are essential to guide the transition of PKI infrastructures toward robust, scalable, and quantum-resistant deployments.
- Research Article
- 10.3390/cryptography10010009
- Feb 10, 2026
- Cryptography
- Jimmy Dani + 2 more
Indistinguishability is a fundamental principle of cryptographic security, crucial for securing data transmitted between Internet of Things (IoT) devices. This principle ensures that an attacker cannot distinguish ciphertext from random data, nor tell apart the ciphertexts of two messages encrypted under the same key. This research investigates the ability of machine learning (ML) to assess the indistinguishability property of encryption systems, focusing on lightweight ciphers; as a first case study, we consider the SPECK32/64 and SIMON32/64 lightweight block ciphers, designed for IoT devices operating under significant energy constraints. We introduce MIND-Crypt (a Machine-learning-based framework for assessing the INDistinguishability of Cryptographic algorithms), which evaluates these two ciphers in CBC, CFB, OFB, and CTR modes under known-plaintext attacks (KPAs). Our approach trains ML models on ciphertexts from two plaintext messages encrypted with the same key to determine whether the models can identify meaningful cryptographic patterns or leakage. Our experiments show that modern ML techniques consistently achieve accuracy equivalent to random guessing, indicating that no statistically exploitable patterns exist in the ciphertexts generated by the considered lightweight block ciphers. Although some models exhibit mode-dependent bias (e.g., collapsing to a single-class prediction in CBC and CFB), their overall accuracy remains at random-guessing levels, reinforcing that no meaningful distinguishing patterns are learned. Furthermore, we demonstrate that, when ML models are trained on all possible combinations of ciphertexts for given plaintext messages, their behavior reflects memorization rather than generalization to unseen ciphertexts. Collectively, these findings suggest that the examined block ciphers have secure cryptographic designs against ML-based indistinguishability assessments, reinforcing their security even under round-reduced conditions.
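A toy version of the distinguishing experiment: label each ciphertext by which of two plaintexts it encrypts (same key, fresh random IVs) and check that a classifier stays at chance. AES-CBC stands in for SPECK32/64 and SIMON32/64, which are absent from common Python libraries, so this only loosely mirrors the paper's setup:

```python
# Illustrative IND experiment: a classifier at ~0.5 accuracy means no
# exploitable ciphertext pattern distinguishes the two plaintexts.
import os
import numpy as np
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

key = os.urandom(16)
msgs = [b"attack at dawn!!", b"retreat at dusk!"]    # two 16-byte plaintexts

def encrypt(pt):
    iv = os.urandom(16)                               # fresh IV per query
    enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    return iv + enc.update(pt) + enc.finalize()

X, y = [], []
for _ in range(5000):
    label = np.random.randint(2)
    ct = encrypt(msgs[label])
    X.append(np.unpackbits(np.frombuffer(ct, dtype=np.uint8)))  # bits as features
    y.append(label)

Xtr, Xte, ytr, yte = train_test_split(np.array(X), np.array(y), test_size=0.2)
clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
print("test accuracy:", clf.score(Xte, yte))          # expected: ~0.5 (chance)
```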
- Research Article
- 10.3390/cryptography10010008
- Jan 27, 2026
- Cryptography
- Adrian Donatien-Charon + 4 more
This article presents general methodologies for plaintext attacks on block ciphers using the Tabu Search algorithm. These methods treat the cipher as a black box, with the objective of recovering the session key. The primary innovation of our approach is the division of the key space into subsets based on a divisor, enabling the attack to focus on a specific portion of the total space. Our investigation demonstrates the successful application of these methods to a member of a block cipher family that includes the Advanced Encryption Standard (AES). One of the proposed methodologies, the subregions path attack, navigates the session-key space by applying specific predetermined strategies within these subregions.
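A toy Python sketch of the subregion idea: restrict the search to keys in one residue class modulo a divisor and run a standard tabu search against a black-box fitness oracle. The 16-bit key space and the bit-match oracle are illustrative stand-ins for the paper's block-cipher setting, not its actual attack:

```python
# Illustrative tabu search confined to the subregion { k : k ≡ R (mod D) }.
import random

D, R, KEY_BITS = 8, 3, 16                 # subregion: keys k ≡ 3 (mod 8)
SECRET = (0x9C63 // D) * D + R            # plant the secret inside the subregion

def key(j):                               # subregion index j ↦ candidate key
    return R + j * D

def fitness(j):
    # Black-box score (stand-in for comparing a trial decryption against a
    # known plaintext): number of key bits a candidate shares with SECRET.
    return bin(~(key(j) ^ SECRET) & ((1 << KEY_BITS) - 1)).count("1")

def tabu_search(iters=500, tenure=32):
    span = KEY_BITS - D.bit_length() + 1  # 13 index bits when D = 8
    j = random.randrange(1 << span)
    best, tabu = j, []
    for _ in range(iters):
        # Single-bit flips of the index keep every move inside the subregion.
        moves = [j ^ (1 << b) for b in range(span) if j ^ (1 << b) not in tabu]
        if not moves:
            break
        j = max(moves, key=fitness)       # best admissible neighbor
        tabu.append(j)
        if len(tabu) > tenure:
            tabu.pop(0)                   # fixed-tenure tabu list
        if fitness(j) > fitness(best):
            best = j
    return key(best)

print(hex(tabu_search()), hex(SECRET))    # recovered key vs. planted target
```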
- Research Article
- 10.3390/cryptography10010007
- Jan 18, 2026
- Cryptography
- Daniel Alarcón-Narváez + 2 more
We present an algebraic framework for constructing challenge–response authentication protocols based on powers of non-diagonalizable matrices over finite fields. The construction relies on upper triangular Toeplitz matrices with a single Jordan block and on their structured power expansions, which induce nonlinear relations between matrix parameters and exponents through an autopotency phenomenon. The protocol is built from a cyclic family of matrix products derived from secret matrices $(A_i)_{i=1}^{n} \subset \mathrm{GL}_k(\mathbb{F}_p)$: for each index $i$, a product $P_i = A_i A_{i+1} \cdots A_{i+n-1}$ is formed (indices modulo $n$), and its power $P_i^{x}$ is published for a secret exponent $x$. The resulting family of powered products is linked by conjugation via the unknown factors $A_i$, enabling an interactive authentication mechanism in which the prover demonstrates knowledge of selected factors by satisfying explicit conjugacy relations. We formalize the underlying algebraic problems in terms of factor recovery and conjugacy identification from powered products, and analyze how the enforced non-diagonalizable structure and Toeplitz constraints lead to coupled multivariate polynomial systems. These systems arise naturally from the algebraic design of the construction and do not admit immediate reductions to classical discrete logarithm settings. The framework illustrates how non-diagonalizable matrix structures and structured conjugacy relations can be used to define concrete authentication primitives in noncommutative algebraic settings, and provides a basis for further cryptanalytic and cryptographic investigation.
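A small Python sketch of the published objects and the conjugacy relation the prover exhibits: since indices are cyclic, $P_{i+1} = A_i^{-1} P_i A_i$, so $A_i P_{i+1}^{x} = P_i^{x} A_i$ can be checked without inverting anything. The parameters $(p, k, n, x)$ are toy sizes chosen for illustration, not the paper's recommendations:

```python
# Toy model of the cyclic powered-product structure over F_p.
import random

p, k, n, x = 10007, 4, 5, 12345

def toeplitz(diags):
    # Upper triangular Toeplitz: entry (r, c) = diags[c - r] for c >= r.
    return [[diags[c - r] if c >= r else 0 for c in range(k)] for r in range(k)]

def mat_mul(A, B):
    return [[sum(A[r][t] * B[t][c] for t in range(k)) % p for c in range(k)]
            for r in range(k)]

def mat_pow(A, e):
    R = [[int(r == c) for c in range(k)] for r in range(k)]   # identity
    while e:
        if e & 1:
            R = mat_mul(R, A)
        A, e = mat_mul(A, A), e >> 1                          # square-and-multiply
    return R

# Secret factors: nonzero diagonal (invertible) and nonzero superdiagonal,
# which forces a single Jordan block, hence non-diagonalizability.
As = [toeplitz([random.randrange(1, p), random.randrange(1, p)]
               + [random.randrange(p) for _ in range(k - 2)]) for _ in range(n)]

def P(i):
    prod = As[i % n]
    for j in range(1, n):
        prod = mat_mul(prod, As[(i + j) % n])
    return prod

pubs = [mat_pow(P(i), x) for i in range(n)]   # published powered products

# Prover's check for factor A_i:  A_i · P_{i+1}^x == P_i^x · A_i
i = 2
assert mat_mul(As[i], pubs[(i + 1) % n]) == mat_mul(pubs[i], As[i])
```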