Abstract

Like other divergences, Jeffrey's divergence (JD) is used for change detection, model comparison, and related tasks. Recently, a great deal of attention has been paid to this symmetric version of the Kullback-Leibler (KL) divergence. This has led to analytical expressions of the JD between autoregressive (AR) processes, moving-average (MA) processes, either noise-free or disturbed by additive white noise, as well as ARMA processes. In this paper, we propose to study the JD between processes defined as sums of complex-valued sinusoidal processes disturbed by additive white noise. We show that the JD tends to a stationary behavior when the number of variates becomes large: its derivative tends to a constant that depends on the parameters defining the processes. The convergence speed towards this stationary regime depends on the differences between the normalized angular frequencies; the smaller the difference, the slower the convergence. This result can be obtained by interpreting some steps of the JD computation as orthogonal projections. Examples illustrate the theoretical analysis.
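As an illustration of the quantity under study, the sketch below builds the covariance matrix of a sum of complex sinusoids with independent uniform phases plus white noise, and computes the JD as the symmetrized KL divergence. It is only a minimal sketch under assumptions not stated in the abstract: the processes are modeled as zero-mean circularly-symmetric complex Gaussian vectors, the JD is taken as the plain sum of the two KL divergences, and the amplitudes, normalized angular frequencies, and noise variances are hypothetical placeholder values, not parameters from the paper.

```python
import numpy as np

def sinusoids_plus_noise_cov(powers, thetas, sigma2, k):
    """k x k covariance of a sum of complex sinusoids (uniform random phases)
    plus additive white noise: entry (n, m) is sum_i P_i exp(j*theta_i*(n-m))
    plus sigma2 on the diagonal. Parameter values here are illustrative only."""
    n = np.arange(k)
    lags = n[:, None] - n[None, :]
    cov = sum(P * np.exp(1j * th * lags) for P, th in zip(powers, thetas))
    return cov + sigma2 * np.eye(k)

def jeffreys_divergence(Q1, Q2):
    """JD = KL(p1||p2) + KL(p2||p1) for zero-mean circularly-symmetric complex
    Gaussian vectors (an assumption of this sketch); the log-determinant terms
    cancel, leaving only the two trace terms."""
    k = Q1.shape[0]
    t12 = np.trace(np.linalg.solve(Q2, Q1)).real  # tr(Q2^{-1} Q1)
    t21 = np.trace(np.linalg.solve(Q1, Q2)).real  # tr(Q1^{-1} Q2)
    return t12 + t21 - 2 * k

# Two single-sinusoid processes with close normalized angular frequencies.
for k in (10, 20, 40, 80):
    Q1 = sinusoids_plus_noise_cov([1.0], [0.30 * np.pi], 0.1, k)
    Q2 = sinusoids_plus_noise_cov([1.0], [0.32 * np.pi], 0.1, k)
    print(k, jeffreys_divergence(Q1, Q2))
```

Under these assumptions, printing the JD for increasing k would be expected to show the increment per additional variate settling to a constant, consistent with the stationary regime described above, with a slower approach to that regime when the normalized angular frequencies of the two processes are closer.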
