Abstract

We propose a globally convergent alternating minimization (AM) algorithm for image reconstruction in transmission tomography, which extends automatic relevance determination (ARD) to Poisson noise models with Beer's law. The algorithm promotes solutions that are sparse in the pixel/voxel-difference domain by introducing additional latent variables, one for each pixel/voxel, and then learning these variables from the data using a hierarchical Bayesian model. Importantly, the proposed AM algorithm is free of any tuning parameters, while achieving image quality comparable to that of standard penalized likelihood methods. Our algorithm exploits optimization transfer principles that reduce the problem to parallel one-dimensional optimization tasks (one for each pixel/voxel), making it feasible for large-scale problems. This approach considerably reduces the computational bottleneck of ARD associated with computing the posterior variances. Positivity constraints inherent in transmission tomography problems are also enforced. ...
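
The abstract describes the algorithm's structure at a high level: alternate between learning per-pixel latent variables and updating the image through parallel one-dimensional surrogate minimizations under a positivity constraint. The Python sketch below is only an illustration of that structure under simplified assumptions (a generic separable surrogate and a toy least-squares data fit); it is not the authors' VARD algorithm, and all function names are hypothetical.

    import numpy as np

    def alternating_minimization(x0, surrogate_grad, surrogate_curv,
                                 update_weights, n_iters=100):
        # Generic AM loop: update per-pixel latent weights, then take
        # independent 1-D Newton steps on a separable surrogate, projected
        # onto the nonnegative orthant (positivity constraint).
        x = x0.copy()
        for _ in range(n_iters):
            w = update_weights(x)            # latent-variable (relevance) step
            g = surrogate_grad(x, w)         # per-pixel surrogate gradients
            c = surrogate_curv(x, w)         # per-pixel surrogate curvatures (> 0)
            x = np.maximum(x - g / c, 0.0)   # parallel 1-D steps + positivity
        return x

    # Toy usage: Gaussian data fit 0.5*||A x - y||^2 with a weighted quadratic
    # penalty 0.5*sum_j w_j x_j^2 and an illustrative reweighting rule.
    rng = np.random.default_rng(0)
    A = rng.random((100, 20))
    y = A @ rng.random(20)
    sqs_curv = A.T @ A.sum(axis=1)           # separable-paraboloid curvature (valid since A >= 0)
    grad = lambda x, w: A.T @ (A @ x - y) + w * x
    curv = lambda x, w: sqs_curv + w
    weights = lambda x: 1.0 / (x**2 + 1e-3)  # illustrative ARD-style reweighting
    x_hat = alternating_minimization(np.zeros(20), grad, curv, weights)

In the actual method the data-fit term is the Poisson transmission log-likelihood under Beer's law and the latent weights come from the hierarchical Bayesian model, but the separable-surrogate structure is what makes the per-pixel updates parallel.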

Highlights

  • We present a new framework for automatic relevance determination (ARD), which we call variational ARD (VARD)

  • It provides an extension of previous ARD methods to Poisson noise models for transmission tomography and allows the use of neighborhood penalties that promote sparsity in the pixel/voxel-difference domain (a generic form of such a penalty is sketched below)
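
As a point of reference, and as an assumption about notation rather than the paper's exact statement, neighborhood penalties of this kind are conventionally written as a sum of sparsity-promoting potentials applied to the differences between each pixel and its neighbors:

    R(x) \;=\; \sum_{j} \sum_{k \in \mathcal{N}_j} w_{jk}\, \psi\!\left(x_j - x_k\right),

where \mathcal{N}_j denotes the neighborhood of pixel j, the w_{jk} are nonnegative neighbor weights, and \psi is a potential chosen to favor sparse differences; the specific choice used by VARD is defined in the paper.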


Introduction

Tomographic image reconstruction is the process of estimating an object from measurements of its line integrals along different angles [1]. It is desirable to make use of prior knowledge, which includes the statistical characteristics of the measured data and the properties that are expected of the image. For the convenience of readers unfamiliar with ARD, we first provide background explaining the potential advantages of ARD that motivate this work, and we discuss the key differences between ARD and other, more common sparse estimation methods. Statistical iterative methods for CT [6, 9, 11, 13] cast image reconstruction as a penalized maximum likelihood estimation problem, given by (1.2). For many common choices of p(x|β) considered below, decreasing β puts more weight on the data-fit term, resulting in increased variance and noisier images. The solution of (1.2) can be interpreted as the MAP solution, i.e., the maximizing point of the posterior distribution, which is given by Bayes' theorem (1.3).
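
Equations (1.2) and (1.3) are referenced but not reproduced in this excerpt; for readability, the generic penalized-likelihood / MAP form they conventionally take is sketched below, where R denotes the penalty and β its weight (this is an assumption about notation, not the paper's exact statement):

    \hat{x} \;=\; \arg\max_{x \ge 0}\; \big\{ \log p(y \mid x) \;-\; \beta\, R(x) \big\},
    \qquad
    p(x \mid y) \;=\; \frac{p(y \mid x)\, p(x \mid \beta)}{p(y)},
    \qquad
    p(x \mid \beta) \;\propto\; \exp\{-\beta\, R(x)\}.

Under this form, decreasing β flattens the prior p(x|β), so the data-fit (log-likelihood) term dominates, which is consistent with the noisier images described above.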

