Abstract

The volume of securely encrypted data transmission required by today's increasingly complex networks of people, transactions and interactions grows continuously. To guarantee the security of encryption and decryption schemes for exchanging sensitive information, large volumes of true random numbers are required. Here we present a method that exploits the stochastic nature of chemistry by synthesizing DNA strands composed of random nucleotides. We compare three commercial random DNA syntheses, providing a measure of the robustness and of the nucleotide distribution of each synthesis, and show that using DNA for random number generation we can obtain 7 million GB of randomness from one synthesis run, which can be read out using state-of-the-art sequencing technologies at rates of ca. 300 kB/s. Using the von Neumann algorithm for data compression, we remove bias introduced from human or technological sources and assess randomness using NIST's statistical test suite.

Highlights

  • The volume of securely encrypted data transmission required by today's increasingly complex networks of people, transactions and interactions grows continuously

  • Methods for identifying global patterns of the microbial component of the biosphere require primers synthesized with random nucleotides at specific positions, targeting hypervariable regions (for example, of the 16S rRNA gene) to allow for taxonomic classification[23,24,25,26,27]

  • After a second compression, the bit streams became random. These results demonstrate the robustness of random number generation by DNA synthesis combined with de-biasing using the von Neumann algorithm
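The de-biasing step named in the highlights is the classic von Neumann extractor: the bit stream is read in non-overlapping pairs, a 01 pair emits 0, a 10 pair emits 1, and the biased pairs 00 and 11 are discarded. A minimal Python sketch for illustration (not the authors' implementation):

```python
def von_neumann_extract(bits):
    """De-bias a bit stream with the von Neumann algorithm.

    Non-overlapping pairs are read from the input; 01 emits 0,
    10 emits 1, and the correlated pairs 00 and 11 are discarded.
    """
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)  # a is 0 for the pair 01 and 1 for 10
    return out

# A stream biased toward 1 still yields unbiased output bits,
# at the cost of discarding most of the input:
biased = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]
print(von_neumann_extract(biased))  # → [0, 1, 1]
```

Because agreeing pairs are thrown away, the extractor shortens the stream (to at most one quarter of its length even for an unbiased source), which is why the text describes the de-biasing as a compression step.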



Introduction

The volume of securely encrypted data transmission required by today's increasingly complex networks of people, transactions and interactions grows continuously. To guarantee the security of encryption and decryption schemes for exchanging sensitive information, large volumes of true random numbers are required. As the modern world shifted from algorithms to interactions, it came to depend on network security services, and on encryption and decryption schemes for exchanging information securely, which in turn require high-quality random numbers generated quickly while remaining resistant to attack[4,5]. Among hardware RNGs, the Intel RNG, for example, provides 500 MB/s of throughput. Such hardware RNGs create bit streams depending on highly unpredictable physical processes, making them useful for secure data transmission as they are less prone to cryptanalytic attacks[8,9,10,11]. A true RNG uses a non-deterministic (chaotic) source for random number generation[12,13], whereas a pseudo-RNG relies on a deterministic algorithm.
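The distinction between the two classes of generator can be made concrete in a few lines of Python: a pseudo-RNG seeded with the same value reproduces its output exactly, whereas an entropy-backed source does not. (A sketch for illustration only; Python's `secrets` module draws from the operating system's entropy pool, not from a dedicated hardware or chemical source such as the one described here.)

```python
import random
import secrets

# Pseudo-RNG: a deterministic algorithm, so the same seed
# reproduces the same bit stream every time.
a = random.Random(42).getrandbits(64)
b = random.Random(42).getrandbits(64)
print(a == b)  # → True

# Entropy-backed RNG: repeated reads are unpredictable and,
# with overwhelming probability, differ between calls.
x = secrets.randbits(64)
y = secrets.randbits(64)
print(x == y)
```

The reproducibility of the first generator is exactly what makes pseudo-RNGs unsuitable on their own for cryptographic key material: an attacker who recovers the seed recovers the entire stream.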
