Abstract

Low-dose computed tomography (LDCT) imaging can greatly reduce the radiation dose delivered to the patient; however, it yields measured projection data with a low signal-to-noise ratio (SNR). With conventional analytical reconstruction methods (e.g., filtered back-projection), the reconstructed LDCT images typically suffer from severe noise. To obtain high-quality CT images, iterative reconstruction that incorporates prior knowledge of the object is therefore essential. In this work, structural group sparsity and gradient sparsity priors are combined into a novel joint regularization constraint in the proposed CT reconstruction model. To solve the resulting optimization-based reconstruction problem, the original problem is decomposed into a series of sub-problems within the alternating direction method of multipliers (ADMM) framework. The merit of the proposed joint regularization is that both global and local sparsity are exploited. To validate the proposed reconstruction algorithm, we conducted simulation experiments at different noise levels as well as real-data studies. Qualitative and quantitative analyses show that the proposed algorithm performs better than other iterative reconstruction algorithms. Moreover, compared with existing iterative reconstruction methods, the proposed algorithm reconstructs important structural features well while effectively suppressing noise and artifacts.
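The abstract does not give the exact objective function, but a minimal sketch of how such a jointly regularized reconstruction is commonly posed may help fix ideas; the operators, weights, and norms below (A, y, Sigma, G_k, the gradient term, lambda_1, lambda_2) are assumptions for illustration, not the paper's formulation:

    \hat{x} = \arg\min_{x \ge 0} \; \tfrac{1}{2}\,\lVert y - A x \rVert_{\Sigma^{-1}}^{2}
              + \lambda_{1} \sum_{k} \lVert G_{k} x \rVert_{2,1}
              + \lambda_{2} \lVert \nabla x \rVert_{1}

Here A would denote the system (projection) matrix, y the measured low-dose projections, \Sigma^{-1} a statistical weighting reflecting the projection noise, G_k an operator collecting structurally similar patches into a group (the global, group-sparsity prior), and \nabla the discrete image gradient (the local sparsity prior). Under an ADMM splitting of this kind, auxiliary variables such as u = G x and v = \nabla x are introduced, so each iteration alternates between a weighted least-squares update of x, proximal (shrinkage) updates of u and v, and updates of the corresponding dual variables.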
