Abstract
Entropic priors, recently revisited in the context of theoretical physics, were originally introduced for image processing and for general statistical inference. They appear to be a very promising approach to objective prior determination when prior information is not available. Attention has mostly been limited to continuous parameter spaces; our focus in this work is on applying the entropic prior idea to Bayesian inference with discrete classes in signal processing problems. Unfortunately, it is well known that entropic priors, when applied to sequences, may lead to excessive spreading of the entropy as the number of samples grows. In this paper we show that this spreading of the entropy can be tolerated as long as the posterior probabilities remain consistent. Using the Asymptotic Equipartition Property (AEP), we derive a condition for posterior consistency based on conditional entropies and KL divergences. Furthermore, we show that entropic priors can be modified to enforce posterior consistency by adding a constraint to the joint entropy maximization. Simulations of the application of entropic priors to a coin-flipping experiment are included.
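To make the discrete-class setting concrete, the sketch below is a hypothetical illustration (not taken from the paper) of an entropic prior over a small set of candidate coin biases. It assumes the common construction in which maximizing the joint entropy H(X, C) over the prior p(c) yields p(c) proportional to exp(H(X | c)); for N independent flips of a coin with bias theta_c, H(X | c) = N h(theta_c), where h is the binary entropy. The function names and the candidate biases are invented for illustration only.

import numpy as np

def binary_entropy(theta):
    """Binary entropy h(theta) in nats, clipped to avoid log(0)."""
    theta = np.clip(theta, 1e-12, 1 - 1e-12)
    return -(theta * np.log(theta) + (1 - theta) * np.log(1 - theta))

def entropic_prior(thetas, n_flips):
    """Entropic prior p(c) proportional to exp(N * h(theta_c)) over candidate biases."""
    log_weights = n_flips * binary_entropy(thetas)
    log_weights -= log_weights.max()          # subtract max for numerical stability
    weights = np.exp(log_weights)
    return weights / weights.sum()

thetas = np.array([0.1, 0.3, 0.5, 0.7, 0.9])  # hypothetical candidate coin biases
for n in (1, 10, 100):
    print(n, np.round(entropic_prior(thetas, n), 4))

Under this construction the exponent scales with N, so as the number of samples grows the prior mass collapses onto the maximum-entropy class (the fair coin), which illustrates the kind of N-dependent spreading behavior that the abstract says must be controlled for the posterior to remain consistent.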