Abstract

Information security has become an intrinsic part of data communication. Cryptanalysis using deep learning-based methods to identify weaknesses in ciphers, however, has not been thoroughly studied. Recently, long short-term memory (LSTM) networks have shown promising performance in sequential data processing by modeling dependencies and data dynamics. Given an encrypted ciphertext sequence and the corresponding plaintext, an LSTM can take advantage of sequential processing to adaptively learn the decryption function regardless of its complexity, substantially outperforming traditional methods. However, a lengthy ciphertext sequence causes an LSTM to lose important information along the sequence, degrading network performance. To tackle this problem, we propose adding an attention mechanism to strengthen the LSTM's sequential processing power. This paper presents a novel, dynamic way to attack classical ciphers using an attention-based LSTM encoder-decoder that handles different ciphertext sequence lengths. The proposed approach takes in a sequence of ciphertext and outputs a sequence of plaintext. The effectiveness and flexibility of the proposed model were evaluated on different classical ciphers. We achieved close to 100% accuracy in breaking all types of classical ciphers in both character-level and word-level attacks. We provide further empirical insights into our results on two datasets with short and long ciphertext lengths, along with a performance comparison of the proposed method against state-of-the-art methods. The proposed approach has the potential to attack modern ciphers. To the best of our knowledge, this is the first time an attention-based LSTM encoder-decoder has been applied to attack classical ciphers.
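
The following is a minimal sketch, not the authors' implementation, of how an attention-based LSTM encoder-decoder can map a ciphertext sequence to a plaintext sequence. PyTorch, the vocabulary and hidden sizes, dot-product (Luong-style) attention, and the simplified teacher-forcing setup are all illustrative assumptions.

```python
# A minimal sketch of an attention-based LSTM encoder-decoder for
# ciphertext-to-plaintext sequence transduction. Sizes, the attention
# variant, and the training setup are assumptions, not the paper's exact design.
import torch
import torch.nn as nn


class AttentionSeq2Seq(nn.Module):
    def __init__(self, vocab_size: int, emb_dim: int = 64, hidden: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hidden, batch_first=True)
        # Combine the attention context with the decoder state, then
        # project to a distribution over plaintext symbols.
        self.out = nn.Linear(hidden * 2, vocab_size)

    def forward(self, cipher: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # cipher, target: (batch, seq_len) integer symbol ids
        enc_out, state = self.encoder(self.embed(cipher))     # (B, S, H)
        dec_out, _ = self.decoder(self.embed(target), state)  # (B, T, H)
        # Dot-product attention: each decoder step attends over all encoder steps,
        # so even long ciphertexts stay accessible at every output position.
        scores = torch.bmm(dec_out, enc_out.transpose(1, 2))  # (B, T, S)
        weights = torch.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_out)                 # (B, T, H)
        return self.out(torch.cat([dec_out, context], dim=-1))  # (B, T, V)


# Toy usage with a 26-letter alphabet. For brevity the decoder input is the
# target itself; a real setup would shift it right and add start/end tokens.
model = AttentionSeq2Seq(vocab_size=26)
cipher = torch.randint(0, 26, (8, 40))  # batch of 8 ciphertexts, length 40
plain = torch.randint(0, 26, (8, 40))
logits = model(cipher, plain)
loss = nn.functional.cross_entropy(logits.reshape(-1, 26), plain.reshape(-1))
```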

Highlights

  • Security is a major concern in all fields in which information is protected by various encryption methods

  • Recurrent neural networks (RNNs) are a class of neural network capable of forming a state history of previous inputs, which makes them suitable for learning algorithmic tasks; they have become a cornerstone of many natural language processing (NLP) applications

  • Experimental results: we describe multiple experiments that show the efficiency of the attention-based long short-term memory (LSTM) encoder-decoder in attacks on classical ciphers (a toy data-generation sketch follows this list)
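
As a concrete illustration of this kind of experimental setup, the sketch below generates character-level (ciphertext, plaintext) training pairs for a monoalphabetic substitution cipher, one of the classical ciphers attacked. The alphabet, key derivation, and pair format are assumptions for illustration, not the paper's exact data pipeline.

```python
# Illustrative only: building supervised training pairs for a simple
# substitution cipher. Details here are assumed, not taken from the paper.
import random
import string


def make_substitution_key(seed: int = 0) -> dict:
    """Derive a random one-to-one letter mapping from a seed."""
    rng = random.Random(seed)
    letters = list(string.ascii_lowercase)
    shuffled = letters[:]
    rng.shuffle(shuffled)
    return dict(zip(letters, shuffled))


def encrypt(plaintext: str, key: dict) -> str:
    """Substitute each letter via the key; pass other characters through."""
    return "".join(key.get(c, c) for c in plaintext.lower())


key = make_substitution_key()
plain = "attack at dawn"
cipher = encrypt(plain, key)
# (cipher, plain) is one character-level training pair for the seq2seq model.
print(cipher, "->", plain)
```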

Summary

INTRODUCTION

Security is a major concern in all fields in which information is protected by various encryption methods. We present a new approach to classical cipher attacks using an attention-based LSTM encoder-decoder model. The sequence-to-sequence decryption ability of the proposed method preserves the sequential nature of the language model (LM) and the alignment between the input and output sequences, which is an important step in conducting an efficient attack on different ciphers; one common formulation of this alignment is sketched after this paragraph. Our experiments and results evaluate the effectiveness of the proposed approach: we conducted experiments on two different datasets with short and long ciphertext sequence lengths, and we conducted attacks at both the character level and the word level.
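
One common way to write the input-output alignment the attention mechanism computes is Luong-style global attention; the exact scoring function used in the paper is an assumption here.

```latex
% Luong-style global attention (an assumed formulation, not necessarily the
% paper's exact scoring function). h_t: decoder state at output step t;
% \bar{h}_s: encoder state at ciphertext position s.
\[
\alpha_{t,s} = \frac{\exp\left(h_t^{\top}\bar{h}_s\right)}
                    {\sum_{s'}\exp\left(h_t^{\top}\bar{h}_{s'}\right)},
\qquad
c_t = \sum_{s}\alpha_{t,s}\,\bar{h}_s
\]
% \alpha_{t,s} aligns output step t with input position s; the context
% vector c_t is combined with h_t to predict the plaintext symbol at step t.
```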

RECURRENT NEURAL NETWORKS
LSTM ENCODER-DECODER MODEL
EXPERIMENTAL RESULTS
SUBSTITUTION CIPHER BREAKING
CONCLUSION