Abstract
The Poisson distribution arises naturally when dealing with count data, and it has found many applications in inverse problems and imaging. In this work, we develop an approximate Bayesian inference technique based on expectation propagation (EP) for approximating the posterior distribution formed from the Poisson likelihood function and a Laplace-type prior distribution, e.g. the anisotropic total variation prior. The approach iteratively yields a Gaussian approximation: at each iteration, it updates the Gaussian approximation to one factor of the posterior distribution by moment matching. We derive explicit update formulas in terms of one-dimensional integrals, and also discuss stable and efficient quadrature rules for evaluating these integrals. The method is showcased on two-dimensional positron emission tomography (PET) images.
Highlights
The Poisson distribution is widely employed to describe inverse and imaging problems involving count data, e.g. emission computed tomography [40, 44], including positron emission tomography (PET) and single photon emission computed tomography (SPECT).
We develop an approximate Bayesian inference technique based on expectation propagation (EP) for approximating the posterior distribution formed from the Poisson likelihood function and a Laplace-type prior distribution, e.g. the anisotropic total variation prior.
It is worth noting that the Poisson model is especially important in the low-count regime, e.g. 0–10 photons, whereas in the moderate-count regime, heteroscedastic normal approximations can be employed in the reconstruction, leading to a weighted Gaussian likelihood function.
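The low- versus moderate-count distinction above can be illustrated numerically. The sketch below compares the Poisson pmf with a heteroscedastic normal approximation whose variance equals its mean, N(y; z, z); this variance-equals-mean choice is one common convention and is an illustrative assumption here, not necessarily the exact weighted Gaussian likelihood used in the work, and the function name is hypothetical.

```python
import numpy as np
from scipy.stats import norm, poisson

def normal_approx_error(z):
    """Relative error of the heteroscedastic normal approximation
    N(y; z, z) to the Poisson pmf Pois(y; z), evaluated at y = z.

    Illustrative only: variance-equals-mean is one common Gaussian
    surrogate for the Poisson likelihood; details may differ in the paper.
    """
    y = int(z)
    p_pois = poisson.pmf(y, z)                      # exact Poisson probability
    p_norm = norm.pdf(y, loc=z, scale=np.sqrt(z))   # Gaussian surrogate density
    return abs(p_norm - p_pois) / p_pois

# The approximation is accurate for moderate counts but noticeably worse
# for low counts: the relative error at z = 100 is well below 1%, while
# at z = 3 it is a few percent.
```

This is why a Gaussian surrogate is acceptable at moderate counts but the exact Poisson likelihood matters in the low-count regime.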
Summary
The Poisson distribution is widely employed to describe inverse and imaging problems involving count data, e.g. emission computed tomography [40, 44], including positron emission tomography and single photon emission computed tomography. For imaging problems with Poisson data, a full Bayesian treatment is challenging due to the nonnegativity constraint and the high dimensionality of the parameter/data space. We develop a computational strategy for exploring the posterior distribution for Poisson data (with two popular nonnegativity constraints) with a Laplace-type prior based on expectation propagation (EP) [33, 34], in order to deliver a Gaussian approximation. The work [28] discussed a full Bayesian exploration with EP by modifying the posterior distribution using a rectified linear function on the transformed domain of the signal, which induces singular measures on the region violating the constraint. We also describe two useful parameterizations of a Gaussian distribution, the Laplace approximation, and additional comparative numerical results for a one-dimensional problem with MCMC and the Laplace approximation, to shed further insight into the performance of EP algorithms.
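The core EP computation the summary alludes to — matching the moments of a one-dimensional tilted distribution by quadrature, then converting back to a Gaussian site — can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: the function names are hypothetical, plain Gauss–Hermite quadrature stands in for the stable quadrature rules developed in the work, and the nonnegativity constraint is handled crudely by discarding quadrature nodes at z ≤ 0.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss  # probabilists' Gauss-Hermite rule
from scipy.special import gammaln

def tilted_moments(y, m_cav, v_cav, n_quad=80):
    """Normalizing constant, mean and variance of the 1D tilted density
    t(z) proportional to Pois(y; z) N(z; m_cav, v_cav) restricted to z > 0.

    Hypothetical sketch: Gauss-Hermite quadrature in the cavity Gaussian,
    with nodes at z <= 0 simply dropped (a crude truncation, not the
    paper's treatment of the nonnegativity constraint).
    """
    x, w = hermegauss(n_quad)                 # nodes/weights for weight exp(-x^2/2)
    z = m_cav + np.sqrt(v_cav) * x            # map nodes to the cavity N(m_cav, v_cav)
    w = w / np.sqrt(2.0 * np.pi)              # normalize so the weights sum to 1
    keep = z > 0.0
    z, w = z[keep], w[keep]
    log_lik = y * np.log(z) - z - gammaln(y + 1.0)   # log Poisson pmf with mean z
    f = w * np.exp(log_lik)
    Z = f.sum()                               # normalizing constant of the tilted density
    mean = (f * z).sum() / Z
    var = (f * z**2).sum() / Z - mean**2
    return Z, mean, var

def site_update(m_cav, v_cav, mean, var):
    """Gaussian site in natural parameters whose product with the cavity
    N(m_cav, v_cav) reproduces the matched moments (mean, var)."""
    site_prec = 1.0 / var - 1.0 / v_cav
    site_shift = mean / var - m_cav / v_cav
    return site_shift, site_prec
```

With a cavity well away from zero (e.g. y = 4, m_cav = 5, v_cav = 1) the truncation is harmless; the Poisson factor is informative, so the matched variance is smaller than the cavity variance and the resulting site precision is positive.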