Abstract
Two existing methods of probabilistic inference are based on variational principles: maximum entropy and minimum Fisher information. In each case, a probability density function is inferred by setting the first variation of a functional to zero, subject to information constraints. This study considers whether other functionals could serve the same purpose. Starting from requirements of self-consistency and invariance, it is shown that the most general admissible functional is a linear combination of entropy and Fisher information, with the proviso that the usual definition of Fisher information is modified by the inclusion of a prior. This amounts to an axiomatic derivation of entropy and Fisher information. The treatment concerns continuous random variables, and both the single-variable and multivariable cases are considered. Several examples compare inference based on entropy with inference based on Fisher information, and highlight the role of boundary conditions in the latter.
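The abstract does not reproduce the functionals themselves. As a rough sketch only, and assuming the standard relative (prior-referenced) forms for the single-variable case, the quantities named above can be written as below, where m(x) denotes the prior density, alpha and beta are nonnegative weights, and the f_k, F_k, lambda_k are the constraint functions, constraint values, and Lagrange multipliers; all of this notation is assumed here rather than taken from the paper.

\[
  S[p] = -\int p(x)\,\ln\frac{p(x)}{m(x)}\,\mathrm{d}x
  \qquad \text{(entropy relative to the prior)}
\]
\[
  I[p] = \int p(x)\left(\frac{\mathrm{d}}{\mathrm{d}x}\ln\frac{p(x)}{m(x)}\right)^{2}\mathrm{d}x
  \qquad \text{(Fisher information modified by a prior)}
\]
\[
  J[p] = \alpha\,S[p] - \beta\,I[p], \qquad \alpha,\beta \ge 0
\]
\[
  \delta\!\left( J[p] + \sum_{k} \lambda_k\!\left(\int f_k(x)\,p(x)\,\mathrm{d}x - F_k\right) + \lambda_0\!\left(\int p(x)\,\mathrm{d}x - 1\right) \right) = 0
\]

The last line is the generic inference step described in the abstract: the density p(x) is obtained by setting the first variation of the combined functional to zero subject to the moment and normalization constraints.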