Abstract
It is well known that quantization cannot increase the Kullback-Leibler divergence, which can be thought of as the expected value, or first moment, of the log-likelihood ratio. In this paper, we investigate the effects of quantization on the second moment of the log-likelihood ratio. It is shown that quantization may increase the second moment, but the increase is bounded above by 2/e. The result is then applied to decentralized sequential detection problems to provide a simpler sufficient condition for asymptotic optimality theory, and the technique is extended to investigate the quantization effects on other higher-order moments of the log-likelihood ratio and to provide lower bounds on those moments.
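For concreteness, the quantities the abstract refers to can be written out as follows. The notation is illustrative and not taken from the paper: $f_1$ and $f_0$ denote the densities under the two hypotheses, $\phi$ is an arbitrary quantizer of the observation $X$, $f_1^{\phi}$ and $f_0^{\phi}$ are the induced densities of $\phi(X)$, and the second-moment bound is read here as an additive bound under the alternative hypothesis, which is one natural reading of the statement above.

```latex
% First moment (the Kullback-Leibler divergence): quantization cannot increase it.
\[
  \underbrace{\mathbb{E}_{1}\!\left[\log\frac{f_1(X)}{f_0(X)}\right]}_{D(f_1\,\|\,f_0)}
  \;\ge\;
  \mathbb{E}_{1}\!\left[\log\frac{f_1^{\phi}(\phi(X))}{f_0^{\phi}(\phi(X))}\right]
  \qquad\text{for any quantizer }\phi .
\]

% Second moment: quantization may increase it, but (as stated in the abstract)
% the increase is bounded above by 2/e.
\[
  \mathbb{E}_{1}\!\left[\left(\log\frac{f_1^{\phi}(\phi(X))}{f_0^{\phi}(\phi(X))}\right)^{2}\right]
  \;\le\;
  \mathbb{E}_{1}\!\left[\left(\log\frac{f_1(X)}{f_0(X)}\right)^{2}\right] + \frac{2}{e}.
\]
```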