Abstract
Pseudorandom binary sequences have important uses in many fields, such as spread spectrum communications, statistical sampling and cryptography. There are two kinds of methods for evaluating the properties of such sequences: one is based on probability measures, and the other on deterministic complexity measures. However, the relationship between these two approaches remains an interesting open problem. In this paper, we focus on the widely used nonlinear complexity of random sequences and study its distribution, expectation and variance for memoryless sources. Furthermore, we establish the relationship between nonlinear complexity and Shannon's entropy. The results show that Shannon's entropy decreases strictly monotonically with nonlinear complexity.
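The nonlinear complexity studied here can be read as the maximum-order complexity in Jansen's sense: the length of the shortest feedback shift register, with an arbitrary (possibly nonlinear) feedback function, that generates the sequence. Equivalently, it is one more than the length of the longest substring that occurs at least twice followed by different symbols. A minimal brute-force sketch under that standard definition (an illustration, not code from the paper):

```python
def nonlinear_complexity(s):
    """Maximum-order (nonlinear) complexity of a finite binary string s.

    Shortest FSR with an arbitrary feedback function generating s;
    computed as 1 + length of the longest substring occurring at least
    twice with different successor symbols. O(n^3) brute force,
    adequate for illustration only.
    """
    n = len(s)
    best = 0
    for i in range(n):
        for j in range(i + 1, n):
            L = 0
            # extend the common prefix of s[i:] and s[j:] while the
            # successor position j + L + 1 stays inside the string
            while j + L + 1 < n and s[i + L] == s[j + L]:
                L += 1
            if s[i + L] != s[j + L]:
                best = max(best, L + 1)
    return best

# nonlinear_complexity("0110") → 2: "1" occurs twice,
# once followed by "1" and once by "0"
```

For example, the constant string "0000" has complexity 0 and the period-2 string "01010" has complexity 1 (an order-1 register with feedback f(x) = 1 - x reproduces it).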
Highlights
Pseudorandom binary sequences have important uses in many fields, such as error control coding, spread spectrum communications, statistical sampling and cryptography [1,2,3]
Classical methods for generating pseudorandom bit sequences are based on the mid-square method, the linear congruential method, linear and nonlinear feedback shift registers, etc.
Beirami et al. indicated that the entropy rate plays a key role in the performance and robustness of chaotic-map truly random number generators [14], and provided converse and achievable bounds on the binary metric entropy [15], which is the highest rate at which information can be extracted from any given map using the optimal bit-generation function. Besides using probability measures to evaluate a random sequence, many researchers have proposed so-called deterministic complexity measures.