Abstract

L1-norm regularization is commonly used in positron emission tomography (PET) reconstruction to suppress noise artifacts while preserving edges. The alternating direction method of multipliers (ADMM) has proven effective for solving this problem: it sequentially updates the auxiliary variables, the image pixels, and the Lagrangian multipliers. The difficulty lies in obtaining a nonnegative update of the image, and classic ADMM requires updating the image by greedy iteration to minimize the cost function, which is computationally expensive. In this paper, we consider a specific application of ADMM to the L1-norm regularized weighted least squares (WLS) PET reconstruction problem. Our main contribution is the derivation of a new approach that iteratively and monotonically updates the image while remaining self-constrained to the nonnegative region and requiring no predetermined step size. We give a rigorous convergence proof for the quadratic subproblem of the ADMM algorithm considered in this paper. A simplified version is also developed by replacing the exact minimization of the image-related cost function with a single iteration that merely decreases it. The experimental results show that the proposed algorithm with greedy iterations converges faster than other commonly used methods. Furthermore, the simplified version gives a comparable reconstructed result at a far lower computational cost.
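To make the alternating structure described above concrete, the following is a minimal NumPy sketch of a scaled ADMM loop for an L1-norm regularized WLS objective with a nonnegativity constraint. The inner projected-gradient image update is only a generic stand-in for the paper's monotonic, step-size-free update, and all names (P, Y, w, beta, rho) are illustrative assumptions rather than the paper's notation.

```python
import numpy as np

def admm_l1_wls(P, Y, w, beta, rho, n_iter=50, n_inner=5):
    """Hypothetical sketch: scaled ADMM for
        min_{X >= 0}  0.5 * || Y - P X ||_W^2 + beta * ||X||_1
    using the splitting V = X and alternating V-, X-, and multiplier updates.
    This is not the paper's algorithm; names and the inner solver are assumptions."""
    m, n = P.shape
    X = np.zeros(n)
    V = np.zeros(n)
    mu = np.zeros(n)          # scaled Lagrangian multipliers

    def soft_threshold(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    for _ in range(n_iter):
        # V-update: proximal step for the L1 term (soft thresholding)
        V = soft_threshold(X + mu, beta / rho)

        # X-update: a few projected-gradient steps on the quadratic subproblem
        #   0.5 * ||Y - P X||_W^2 + (rho/2) * ||X - V + mu||^2, keeping X >= 0.
        # (The paper derives a monotonic, step-size-free update for this
        #  subproblem; plain projected gradient is used here only as a stand-in.)
        for _ in range(n_inner):
            grad = P.T @ (w * (P @ X - Y)) + rho * (X - V + mu)
            L = np.sum(w * (P ** 2).sum(axis=1)) + rho   # crude Lipschitz bound
            X = np.maximum(X - grad / L, 0.0)

        # multiplier update (dual ascent)
        mu = mu + X - V
    return X
```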

Highlights

  • Positron emission tomography (PET) is an important imaging tool in modern medicine and provides noninvasive quantification of the biochemical and biological processes inside living subjects

  • Several reconstruction methods have been developed and applied in clinical practice. These methods can be roughly divided into two categories: analytical methods and iterative methods

  • A basic target of PET reconstruction is to solve a system of the following form: Y = PX + S (a toy numerical sketch of this forward model is given after this list)

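As a purely illustrative complement to the last highlight, the toy script below builds a small random instance of the forward model Y = PX + S and evaluates the weighted least-squares data term on which the regularized objective is built. All sizes, values, and the inverse-variance choice of weights are assumptions for shape-checking only, not values from the paper.

```python
import numpy as np

# Toy illustration of the forward model Y = P X + S from the highlights:
#   X - unknown activity image (flattened to a vector)
#   P - system matrix mapping image voxels to detector bins
#   S - additive term (e.g. scatter and randoms), Y - measured sinogram
rng = np.random.default_rng(0)
n_bins, n_voxels = 200, 64
P = rng.random((n_bins, n_voxels)) * 0.05   # nonnegative system matrix
X_true = rng.random(n_voxels)               # ground-truth image
S = 0.1 * np.ones(n_bins)                   # additive background term
Y = rng.poisson(P @ X_true + S).astype(float)

# Weighted least-squares data term of the regularized objective,
# with weights w taken here as a simple inverse-variance estimate.
w = 1.0 / np.maximum(Y, 1.0)

def wls_cost(X):
    r = Y - (P @ X + S)
    return 0.5 * np.sum(w * r ** 2)

print(wls_cost(np.zeros(n_voxels)), wls_cost(X_true))
```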

Summary

Introduction

Positron emission tomography (PET) is an important imaging tool in modern medicine that provides noninvasive quantification of the biochemical and biological processes inside living subjects. The subgradient-based method [18] has been developed for solving convex and nonconvex optimization problems; at each step it minimizes a subgradient-related surrogate function to obtain the update. Another method is the alternating direction method of multipliers (ADMM) [19, 20]: by applying distributed optimization over V and X and dual ascent on μ, a unified framework can be used to solve the L1-norm regularized WLS reconstruction problem. The conjugate gradient method is another popular approach; it is typically implemented as an iterative algorithm well suited to the sparse systems that arise in large-scale problems (a minimal sketch is given below). The results show that the simplified version provides a comparable reconstructed result at a considerably lower computational cost than the existing methods.
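The sketch below is a generic conjugate gradient solver of the kind alluded to above, written against a matrix-vector product so it can exploit sparse system matrices. It is not the paper's algorithm; the example operator and right-hand side for the ADMM quadratic subproblem are assumptions, and plain CG does not enforce the nonnegativity constraint that the paper's update is designed to handle.

```python
import numpy as np

def conjugate_gradient(matvec, b, x0=None, tol=1e-8, max_iter=200):
    """Plain conjugate gradient for A x = b with A symmetric positive definite,
    supplied only through the matrix-vector product `matvec`."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - matvec(x)
    p = r.copy()
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# For the quadratic X-subproblem of the ADMM splitting one could, for example, pass
#   matvec = lambda x: P.T @ (w * (P @ x)) + rho * x
#   b      = P.T @ (w * Y) + rho * (V - mu)
# Note that CG alone ignores the nonnegativity constraint on X.
```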

