Abstract

The first part of this review article presents the main inference tools based on Bayes' rule, the maximum entropy principle (MEP), information theory, relative entropy and the Kullback–Leibler (KL) divergence, and Fisher information and its corresponding geometries. For each of these tools, the precise context of its use is described. The second part of the paper focuses on the ways these tools have been used in data, signal and image processing and in the inverse problems that arise in different physical sciences and engineering applications. A few example applications are described: entropy in independent component analysis (ICA) and in blind source separation, Fisher information in data model selection, different maximum entropy-based methods in time series spectral estimation and in linear inverse problems and, finally, Bayesian inference for general inverse problems. Some original material concerning approximate Bayesian computation (ABC) and, in particular, the variational Bayesian approximation (VBA) method is also presented. VBA is proposed as an alternative Bayesian computational tool to the classical Markov chain Monte Carlo (MCMC) methods. We will also see that VBA encompasses the joint maximum a posteriori (MAP) estimate, as well as the different expectation-maximization (EM) algorithms, as particular cases.
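To make the VBA idea concrete, here is a standard formulation (with generic notation q, p, x, θ, y, not necessarily the paper's): the joint posterior p(x, θ | y) of the unknowns x and the hyperparameters θ is approximated by a separable law q(x, θ) = q_1(x) q_2(θ) chosen to minimize the KL divergence

\[
\widehat{q} = \arg\min_{q = q_1 q_2} \mathrm{KL}\big(q(x,\theta) \,\|\, p(x,\theta \mid y)\big)
= \arg\min_{q = q_1 q_2} \iint q(x,\theta) \ln \frac{q(x,\theta)}{p(x,\theta \mid y)} \, dx \, d\theta .
\]

Alternating minimization gives the fixed-point updates \( q_1(x) \propto \exp\big( \langle \ln p(y, x, \theta) \rangle_{q_2} \big) \) and \( q_2(\theta) \propto \exp\big( \langle \ln p(y, x, \theta) \rangle_{q_1} \big) \); constraining q_2 to a Dirac mass yields an EM-type algorithm, and constraining both factors to Dirac masses yields joint MAP, which is the sense in which VBA contains these as particular cases.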

Highlights

  • This paper is an overview and an extension of my tutorial paper presented at the MaxEnt 2014 workshop [1]

  • We review a few examples of their use in different applications, demonstrating how these tools can be used in independent component analysis (ICA) and source separation, in data model selection, in spectral analysis of signals and in inverse problems that arise in many science and engineering applications

  • The maximum entropy principle can be used to assign a probability law to a quantity when the available information about it is in the form of a limited number of constraints on that probability law, as sketched below
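As a standard statement of this principle (textbook material; the constraint functions φ_k and data d_k are generic, not taken from this paper), one maximizes the Shannon entropy subject to K moment constraints:

\[
\max_{p} \; -\int p(x) \ln p(x) \, dx
\quad \text{subject to} \quad
\int \phi_k(x) \, p(x) \, dx = d_k, \quad k = 1, \dots, K,
\]

whose solution is the exponential-family law

\[
p(x) = \frac{1}{Z(\lambda)} \exp\Big( -\sum_{k=1}^{K} \lambda_k \phi_k(x) \Big),
\qquad
Z(\lambda) = \int \exp\Big( -\sum_{k=1}^{K} \lambda_k \phi_k(x) \Big) dx,
\]

with the Lagrange multipliers λ_k chosen so that the constraints hold. For instance, constraining only the mean and the variance of a real-valued quantity yields the Gaussian law.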

Summary

Introduction

This paper is an overview and an extension of my tutorial paper presented at the MaxEnt 2014 workshop [1]. The notion of entropy considered here, Shannon's, has no direct link with entropy in physics, even if, for a particular physical system, we may attribute a probability law to a quantity of interest of that system and define its entropy. This information-theoretic definition of Shannon entropy became the main basis of information theory in many data analyses and in the science of communication. In general, many probability distributions satisfy the constraints imposed by a given problem; to decide among them, Jaynes [6,7,8] introduced the maximum entropy principle (MEP) as a tool for assigning a probability law to a quantity on which we have only incomplete or macroscopic (expected values) information. Kullback [9] was interested in comparing two probability laws and introduced a tool to measure the increase of information content of a new probability law with respect to a reference one.
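For reference, the standard definitions behind these two notions are as follows: for a discrete variable X with probability law p and a reference law q,

\[
H(X) = -\sum_{i} p_i \ln p_i,
\qquad
\mathrm{KL}(p \,\|\, q) = \sum_{i} p_i \ln \frac{p_i}{q_i},
\]

with \( \mathrm{KL}(p \,\|\, q) \ge 0 \) and equality if and only if p = q, which is what makes the KL divergence a natural measure of the information gained when a reference law q is updated to p.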

Bayes Rule
Shannon Entropy
Thermodynamical Entropy
Statistical Mechanics Entropy
Boltzmann Entropy
Gibbs Entropy
Relative Entropy or Kullback–Leibler Divergence
Mutual Information
Maximum Entropy Principle
Link between Entropy and Likelihood
Vectorial Variables and Time Indexed Process
10. Entropy in Independent Component Analysis and Source Separation
11. Entropy in Parametric Modeling and Model Selection
12.1. Burg’s Entropy-Based Method
12.2. Extensions to Burg’s Method
12.3. Shore and Johnson Approach
12.4. ME in the Mean Approach
13.1. Linear Inverse Problems
13.2. Entropy-Based Methods
13.3. Maximum Entropy in the Mean Approach
14.1. Simple Bayesian Approach
14.2. Full Bayesian
15. Basic Algorithms of the Variational Bayesian Approximation
15.1. Case of Two Gaussian Variables
15.2. Case of Exponential Families
16. VBA for the Unsupervised Bayesian Approach to Inverse Problems
17. VBA for a Linear Inverse Problem with Simple Gaussian Priors
18. Bayesian Variational Approximation with Hierarchical Prior Models
19. Bayesian Variational Approximation with Student t Priors
20. Conclusions