Abstract

Jeffrey’s divergence (JD), the symmetric Kullback-Leibler (KL) divergence, has been used in a wide range of applications. Recent works have shown that the JD between the probability density functions of \(k\) successive samples of autoregressive (AR) and/or moving-average (MA) processes tends to a stationary regime as the number \(k\) of variates increases. The asymptotic JD increment, defined as the difference between the JDs computed for \(k\) and \(k-1\) successive variates, tends to a finite constant value as \(k\) increases and can hence be used to compare random processes. However, interpreting the value of the asymptotic JD increment is not an easy task, as it depends on many parameters, namely the AR/MA parameters and the driving-process variances. In this paper, we propose to compute the asymptotic JD increment between processes that have been normalized so that their powers are equal to 1. Analyzing the resulting JD on the one hand, and the ratio between the original signal powers on the other hand, makes the interpretation easier. Examples are provided to illustrate the relevance of this approach.
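As a rough illustration of the normalization idea, here is a minimal Python sketch for two zero-mean Gaussian AR(1) processes; the AR parameters, variances, and function names below are our own assumptions for illustration, not taken from the paper. For a zero-mean Gaussian process, the density of \(k\) successive samples is a multivariate Gaussian with a Toeplitz covariance matrix; dividing that matrix by the power \(r(0)\) normalizes the process to unit power, and the JD between two zero-mean Gaussians reduces to a trace expression because the log-determinant terms of the two KL divergences cancel in the symmetric sum.

```python
import numpy as np
from scipy.linalg import toeplitz

def ar1_cov(a, sigma2, k):
    """Covariance matrix of k successive samples of a zero-mean AR(1)
    process x[n] = a*x[n-1] + u[n], with driving-noise variance sigma2."""
    r0 = sigma2 / (1.0 - a**2)        # process power r(0)
    r = r0 * a ** np.arange(k)        # autocovariance r(tau) = r(0) * a^|tau|
    return toeplitz(r)

def jeffreys(S1, S2):
    """Jeffrey's divergence between zero-mean Gaussians N(0, S1) and N(0, S2).
    JD = 0.5 * (tr(S2^{-1} S1) + tr(S1^{-1} S2) - 2k); log-det terms cancel."""
    k = S1.shape[0]
    return 0.5 * (np.trace(np.linalg.solve(S2, S1))
                  + np.trace(np.linalg.solve(S1, S2)) - 2 * k)

# Illustrative AR(1) parameters (assumed values, not from the paper).
a1, s1 = 0.5, 1.0
a2, s2 = 0.8, 2.0

for k in (2, 5, 10, 20, 50):
    C1 = ar1_cov(a1, s1, k);  C1 /= C1[0, 0]        # unit-power normalization
    C2 = ar1_cov(a2, s2, k);  C2 /= C2[0, 0]
    C1m = ar1_cov(a1, s1, k - 1); C1m /= C1m[0, 0]
    C2m = ar1_cov(a2, s2, k - 1); C2m /= C2m[0, 0]
    inc = jeffreys(C1, C2) - jeffreys(C1m, C2m)     # JD increment
    print(f"k = {k:3d}   JD increment = {inc:.6f}")

# The original signal powers are then compared separately via their ratio.
power_ratio = (s1 / (1 - a1**2)) / (s2 / (1 - a2**2))
print("power ratio:", power_ratio)
```

Running this sketch, the printed increments settle toward a constant as \(k\) grows, consistent with the stationary regime described above, while the power ratio carries the scale information removed by the normalization.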
