Abstract

Comparing processes or models is of interest in various applications. Among the existing approaches, one of the most popular is the Kullback-Leibler (KL) divergence, which is related to Shannon's entropy. Similarly, the Rényi divergence of order α can be deduced from the Rényi entropy; as α tends to 1, it reduces to the KL divergence. In this paper, our purpose is to derive the expression of the Rényi divergence between the probability density functions of k consecutive samples of two real first-order moving average (MA) processes by using the eigen-decompositions of their Toeplitz correlation matrices. The resulting expression is compared with the eigenvalue-based expressions of the Rao distance and the Jeffreys divergence (JD). The way these quantities evolve as k increases is then presented. When dealing with unit-zero MA processes, the derivative is infinite for the JD and finite for the others. The influence of α is also studied.
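The quantities described above can be sketched numerically. The following is a minimal illustration, not the paper's derivation: it assumes zero-mean Gaussian MA(1) processes x_t = e_t + b·e_{t-1} driven by white noise of variance σ², builds the resulting tridiagonal Toeplitz covariance of k consecutive samples, and evaluates the closed-form Rényi divergence of order α between two zero-mean multivariate Gaussians. Function names and parameter choices are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.linalg import toeplitz

def ma1_covariance(b, sigma2, k):
    """Covariance matrix of k consecutive samples of x_t = e_t + b*e_{t-1}."""
    r = np.zeros(k)
    r[0] = sigma2 * (1.0 + b ** 2)  # lag-0 autocovariance
    if k > 1:
        r[1] = sigma2 * b           # lag-1 autocovariance; higher lags vanish
    return toeplitz(r)              # symmetric Toeplitz (tridiagonal here)

def renyi_divergence_gauss(S1, S2, alpha):
    """Rényi divergence of order alpha between N(0, S1) and N(0, S2).

    Uses the standard Gaussian closed form with
    S_alpha = (1 - alpha) * S1 + alpha * S2 (assumed positive definite).
    """
    S_alpha = (1.0 - alpha) * S1 + alpha * S2
    _, ld_a = np.linalg.slogdet(S_alpha)
    _, ld1 = np.linalg.slogdet(S1)
    _, ld2 = np.linalg.slogdet(S2)
    return -(ld_a - (1.0 - alpha) * ld1 - alpha * ld2) / (2.0 * (alpha - 1.0))

if __name__ == "__main__":
    k = 10
    S1 = ma1_covariance(0.5, 1.0, k)   # illustrative MA parameters
    S2 = ma1_covariance(0.8, 1.0, k)
    for alpha in (0.25, 0.5, 0.75, 0.99):
        print(alpha, renyi_divergence_gauss(S1, S2, alpha))
```

As α approaches 1, the value returned by `renyi_divergence_gauss` converges to the Gaussian KL divergence, which can serve as a sanity check; looping over k shows how the divergence grows with the number of samples.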


