Abstract

Region-of-interest (ROI) computed tomography has the advantages of reducing the x-ray dose and allowing the use of a small detector. However, standard analytic reconstruction methods such as filtered backprojection (FBP) suffer from severe cupping artifacts, and existing model-based iterative reconstruction methods require extensive computation. Recently, we proposed a deep neural network to learn the cupping artifacts, but the network did not generalize well across different ROIs because of the singularities in the corrupted images. There is therefore an increasing demand for a neural network that works well for any ROI size. Here, two types of neural networks are designed. The first type learns ROI size-specific cupping artifacts from FBP images, whereas the second type performs the inversion of the truncated Hilbert transform from truncated differentiated backprojection (DBP) data. Their generalizability across different ROI sizes, pixel sizes, detector pitches, and starting angles for a short scan is then investigated. Experimental results show that the new type of neural network significantly outperforms existing iterative methods for all ROI sizes despite significantly lower runtime complexity. In addition, the performance improvement is consistent across different acquisition scenarios. Since the proposed method consistently surpasses existing methods, it can be used as a general CT reconstruction engine for many practical applications, even in the presence of detector truncation.
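For context on the second network type: it relies on the standard DBP formulation, which is not spelled out in this abstract. In its usual form (written here with an unspecified constant $c$, since sign and scaling conventions vary), the DBP image computed from differentiated projection data over a half scan equals, up to that constant, the Hilbert transform of the object along each measured chord $L$:

$$b(\mathbf{x}) \;=\; c\,(\mathcal{H}_{L} f)(\mathbf{x}), \qquad \mathbf{x} \in L \cap \mathrm{ROI}.$$

Under this assumption, ROI reconstruction reduces to inverting $\mathcal{H}_{L}$ from data available only on a truncated interval, which is the mapping the second network type learns.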
