Abstract

The purpose of this paper is first to derive expressions for the various divergences that can be deduced from the Chernoff coefficient in order to compare two probability density functions of vectors storing k consecutive samples of a sum of complex exponentials disturbed by an additive white noise. These include, for instance, the Chernoff divergence and the α-divergence. The Tsallis, reversed Tsallis and Sharma-Mittal divergences are also addressed, as are the β-, γ- and αγ-divergences. The behaviors of the divergences are then studied as k increases and tends to infinity. Depending on the divergence used, either the divergence rate or the asymptotic normalized increment is considered, and expressions that encompass these quantities are also given. Comments and illustrations for comparing random processes are then provided. This study brings out the advantages of the Kullback-Leibler divergence when studying this type of process.
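For orientation, one common set of conventions (the paper's exact definitions and normalizations may differ) writes the Chernoff coefficient between two densities $p_1$ and $p_2$ as

$$c_\alpha(p_1, p_2) = \int p_1^{\alpha}(x)\, p_2^{1-\alpha}(x)\, dx, \qquad \alpha \in (0, 1),$$

from which, for instance, the Chernoff divergence $-\ln c_\alpha(p_1, p_2)$ and the Tsallis divergence $\frac{1}{\alpha-1}\bigl(c_\alpha(p_1, p_2) - 1\bigr)$ follow. For the densities $p_{1,k}$ and $p_{2,k}$ of vectors of $k$ consecutive samples, the divergence rate is $\lim_{k\to\infty} \frac{1}{k} D(p_{1,k} \| p_{2,k})$, whereas the asymptotic normalized increment is $\lim_{k\to\infty} \bigl( D(p_{1,k+1} \| p_{2,k+1}) - D(p_{1,k} \| p_{2,k}) \bigr)$.

As a numerical illustration of the setting, the minimal Python sketch below computes the Kullback-Leibler divergence between two such processes, each modeled here as a zero-mean circular complex Gaussian vector whose covariance combines one complex exponential with a uniformly distributed random phase and additive white noise. The model, the function names and the parameter values are illustrative assumptions, not taken from the paper.

import numpy as np

def covariance(freqs, powers, noise_var, k):
    # Covariance matrix of k consecutive samples of a sum of complex
    # exponentials with random phases plus additive white noise
    # (illustrative model; the paper's exact assumptions may differ).
    n = np.arange(k)
    Q = noise_var * np.eye(k, dtype=complex)
    for f, P in zip(freqs, powers):
        e = np.exp(2j * np.pi * f * n)    # vector of one complex exponential
        Q += P * np.outer(e, e.conj())
    return Q

def kl_gaussian(Q1, Q2):
    # KL divergence between zero-mean circular complex Gaussians
    # N(0, Q1) and N(0, Q2): tr(Q2^{-1} Q1) - k - ln det(Q2^{-1} Q1).
    k = Q1.shape[0]
    M = np.linalg.solve(Q2, Q1)           # Q2^{-1} Q1
    _, logdet = np.linalg.slogdet(M)      # det(M) is real and positive here
    return np.real(np.trace(M)) - k - logdet

# For stationary processes of this kind, the normalized divergence KL/k
# typically stabilizes as k grows, which is what the divergence rate captures.
for k in (10, 50, 200):
    Q1 = covariance([0.10], [1.0], 0.5, k)
    Q2 = covariance([0.12], [1.0], 0.5, k)
    print(k, kl_gaussian(Q1, Q2) / k)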
