Abstract
This paper proposes a Bayesian Cramér–Rao-type lower bound on the minimum mean square error (MMSE). The key idea is to minimize the MMSE subject to the constraint that the joint input–output distribution lies in a Kullback–Leibler divergence ball centered at a Gaussian reference distribution. The bound is tight: it is attained by a Gaussian distribution whose mean is identical to that of the reference distribution and whose covariance matrix is determined by a scalar parameter, obtained as the unique root of a simple function. Examples of applications in signal processing and information theory illustrate the usefulness of the proposed bound in practice.
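The abstract does not state the paper's actual root equation, but the computational step it describes (finding the unique root of a simple scalar function to fix the covariance of the minimizing Gaussian) can be illustrated under a hypothetical simplification: suppose the minimizing Gaussian shrinks the reference covariance by a scalar factor t, so that the KL-ball constraint reduces to (n/2)(t - 1 - log t) = eps. The sketch below, with assumed names `kl_gap` and `covariance_scale`, solves this with a standard bracketing root-finder; it is not the paper's method, only an example of the kind of one-dimensional root-finding the abstract refers to.

```python
# Illustrative sketch only: assumes the minimizing Gaussian has covariance
# t * Sigma_ref with t in (0, 1]; the paper's actual equation may differ.
import numpy as np
from scipy.optimize import brentq


def kl_gap(t: float, n: int, eps: float) -> float:
    """KL(N(mu, t*Sigma) || N(mu, Sigma)) minus the ball radius eps.

    When two Gaussian covariances differ only by a scalar factor t, the
    KL divergence reduces to (n/2) * (t - 1 - log t), independent of Sigma.
    """
    return 0.5 * n * (t - 1.0 - np.log(t)) - eps


def covariance_scale(n: int, eps: float) -> float:
    """Unique root t* in (0, 1) of (n/2)(t - 1 - log t) = eps.

    The map t -> (n/2)(t - 1 - log t) is strictly decreasing on (0, 1),
    diverges as t -> 0+, and vanishes at t = 1, so a sign change is
    guaranteed on the bracket and the root is unique.
    """
    return brentq(kl_gap, 1e-12, 1.0, args=(n, eps))


if __name__ == "__main__":
    t_star = covariance_scale(n=4, eps=0.1)  # dimension 4, KL radius 0.1
    print(f"covariance scaling t* = {t_star:.6f}")
```

Because the function is strictly monotone on the bracket, any standard root-finder (here Brent's method) converges to the unique solution, matching the abstract's claim that the scalar parameter is cheap to compute.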