Abstract
Considering symbolic and numerical random sequences within the framework of the additive Markov chain approach, we establish a relation between their correlation functions and conditional entropies. We express the entropy by means of the two-point probability distribution functions and then evaluate the entropy of the numerical random chain in terms of the correlation function. We show that this approximation gives a satisfactory result only for special types of random sequences; in the general case, the conditional entropy of numerical sequences obtained in the two-point distribution function approach is lower. We derive the conditional entropy of the additive Markov chain as a sum of Kullback-Leibler mutual information terms and give an example of a random sequence with an exactly zero correlation function but nonzero correlations.
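As a rough numerical illustration of the two-point approach described above, the sketch below simulates a binary additive Markov chain (the conditional probability of the next symbol depends additively on the preceding symbols) and compares the conditional entropy estimated from two-point statistics, H(a_n | a_{n-1}), with the one using a longer context, H(a_n | a_{n-1}, a_{n-2}). The memory-function values `F` and the chain length `L` are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)

# Assumed memory function F(k) of an additive binary Markov chain:
# P(a_n = 1 | past) = 1/2 + sum_k F[k] * (a_{n-1-k} - 1/2).
# These values are purely illustrative.
F = [0.3, 0.2]
N = len(F)
L = 200_000  # chain length (illustrative)

a = list(rng.integers(0, 2, size=N))
for n in range(N, L):
    p1 = 0.5 + sum(F[k] * (a[n - 1 - k] - 0.5) for k in range(N))
    a.append(int(rng.random() < p1))

# Collect triple statistics once; pair statistics are obtained by
# marginalizing the same counts, so both entropies refer to one
# empirical joint distribution.
triples = Counter(zip(a, a[1:], a[2:]))
total = sum(triples.values())

def cond_entropy(joint, ctx_len):
    """Plug-in H(last symbol | context) for joint counts over tuples."""
    ctx = Counter()
    for key, c in joint.items():
        ctx[key[:ctx_len]] += c
    h = 0.0
    for key, c in joint.items():
        p = c / total
        h -= p * np.log2(c / ctx[key[:ctx_len]])
    return h

pairs = Counter()
for (x, y, z), c in triples.items():
    pairs[(y, z)] += c

h2 = cond_entropy(pairs, 1)    # two-point approximation H(a_n | a_{n-1})
h3 = cond_entropy(triples, 2)  # longer-context H(a_n | a_{n-1}, a_{n-2})
print(f"h2 = {h2:.4f} bits, h3 = {h3:.4f} bits")
```

Because both estimates derive from the same empirical joint distribution, the standard inequality H(a_n | a_{n-1}, a_{n-2}) ≤ H(a_n | a_{n-1}) holds exactly here; how the two-point expression relates to the true entropy of a given chain is the subject of the analysis in the paper.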