Abstract

Pseudorandom bit sequences are generated by deterministic algorithms to simulate truly random sequences. Many cryptographic algorithms rely on pseudorandom sequences, and the randomness of these sequences greatly impacts the robustness of those algorithms. The Linear Feedback Shift Register (LFSR), an important cryptographic primitive, and its combinations have long been used in stream ciphers to generate pseudorandom bit sequences. The sequences generated by an LFSR can be predicted using the classical Berlekamp-Massey algorithm, which recovers an LFSR of degree n from 2n output bits. Several techniques based on ML classifiers have also been successful at predicting the next bit of LFSR-generated sequences. However, the main limitation of the existing approaches is that they require a large number of bits (relative to the degree of the LFSR) to solve the LFSR. In this paper, we propose a novel Pattern Duplication technique that exponentially reduces the number of input bits required to train the ML model. The Pattern Duplication technique generates new samples from the available data using two properties of the XOR function used in LFSRs. We use a Deep Neural Network (DNN) as the next-bit predictor for LFSR-generated sequences, combined with the Pattern Duplication technique. Thanks to Pattern Duplication, the DNN needs only a very small number of input patterns. Moreover, in some cases the DNN model predicted LFSRs from fewer than 2n bits, i.e., fewer than the Berlekamp-Massey algorithm requires. However, the technique was not successful for LFSRs whose primitive polynomials have a higher number of tap points.
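
For reference, a minimal sketch of how a Fibonacci-style LFSR produces its output stream is shown below. The seed, tap positions, and function name are illustrative assumptions only and are not the primitive polynomials studied in the paper.

```python
# Minimal sketch of a Fibonacci-style LFSR (illustrative taps and seed only;
# not the primitive polynomials used in the paper).
def lfsr_bits(seed_bits, taps, count):
    """Generate `count` output bits from an LFSR of degree len(seed_bits).

    seed_bits: initial register contents as a list of 0/1 values.
    taps: indices of the register cells XORed together to form the feedback bit.
    """
    state = list(seed_bits)
    out = []
    for _ in range(count):
        out.append(state[-1])             # the last cell is the output bit
        feedback = 0
        for t in taps:
            feedback ^= state[t]          # XOR of the tapped cells
        state = [feedback] + state[:-1]   # shift and insert the feedback bit
    return out

# Degree-4 example: 2n = 8 output bits would already suffice for Berlekamp-Massey.
print(lfsr_bits([1, 0, 0, 1], taps=[0, 3], count=16))
```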

Highlights

  • Many randomized algorithms rely on pseudorandom data for their operation

  • The results show that the minimum number of bits required by the Deep Neural Network (DNN) is at least 7% lower than the number required by the Berlekamp-Massey (BM) algorithm; this gap widens as the degree of the Linear Feedback Shift Register (LFSR) increases, reaching up to 45% for an LFSR of degree 100

  • The frame length is recorded at this point, and the model is retrained with this frame length fixed while the number of bits generated by the LFSR is repeatedly reduced, until we achieve close to 100% validation and testing accuracy (sketched below)
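
A rough sketch of this bit-budget search is given below. The function `train_and_score` is a hypothetical stand-in for the authors' DNN training and evaluation step, not their actual code, and the step size is arbitrary.

```python
# Rough sketch of the bit-budget search from the highlight above, assuming a
# caller-supplied train_and_score(bits, frame_length) -> accuracy function
# (a hypothetical stand-in for the DNN training/evaluation step).
def smallest_bit_budget(sequence, frame_length, train_and_score, step=8):
    """Shrink the number of LFSR bits used for training while accuracy stays
    near 100%, and return the smallest budget that still worked."""
    budget = len(sequence)
    best = budget
    while budget > frame_length + 1:
        accuracy = train_and_score(sequence[:budget], frame_length)
        if accuracy < 0.999:      # accuracy dropped: the previous budget was minimal
            break
        best = budget
        budget -= step            # try an even smaller bit budget
    return best
```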


Summary

Introduction

Pseudorandom sequences are not truly random and can be regenerated from a seed. Cryptographic algorithms use these sequences on the assumption that they are unpredictable. Hernández et al. [2000, 2001] converted the prediction problem into a classification problem and used the decision-tree classifier C4.5 to predict pseudorandom sequences generated by an LFSR. Their objective was to use ML-based bit-prediction models as an evaluation criterion. Four machine learning algorithms were used for the bit-prediction problem: the decision-tree-based C4.5 algorithm, Naive Bayes, Averaged One-Dependence Estimators (AODE), and the Multi-Layer Perceptron. The results confirm that these networks can predict the entire sequence from fewer input patterns than the existing reported techniques based on other classifiers.
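
As a concrete picture of this classification framing, the sliding-window construction below turns a bit sequence into (input frame, next bit) training samples. The frame length and the example bit string are arbitrary choices for illustration and are not taken from the paper.

```python
# Turn a bit sequence into (frame, next-bit) classification samples using a
# sliding window; the frame length here is arbitrary and for illustration only.
def make_samples(bits, frame_length):
    samples = []
    for i in range(len(bits) - frame_length):
        frame = bits[i:i + frame_length]      # the classifier's input features
        next_bit = bits[i + frame_length]     # the class label to predict
        samples.append((frame, next_bit))
    return samples

# Example with a short, hand-written bit string.
bits = [0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1]
for frame, label in make_samples(bits, frame_length=4):
    print(frame, "->", label)
```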

