Abstract
Ring artifacts pose a major barrier to obtaining precise reconstructions in computed tomography (CT). Their presence complicates automatic processing of CT reconstruction results, such as segmentation, correction of geometric shapes, and alignment of reconstructed volumes. Although numerous efficient methods for suppressing ring artifacts exist, many of them require manual intervention, while a large proportion of the automatic methods handle the task unsatisfactorily despite their considerable computational cost. The current work introduces a projection data preprocessing method for ring artifact suppression that strikes a compromise among these aspects: automation, high efficiency, and computational speed. Derived as an automation of the classical sinogram normalization method, the proposed method offers adaptability to the filtered sinograms and an edge-preservation property, both demonstrated in experiments on synthetic and real CT data. On challenging open-access data, the method achieved quality comparable to that of advanced methods, reaching a ring artifact suppression percentage (RASP) of 70.4%. Applied to our real laboratory CT data, the proposed method yielded a significant improvement in reconstruction quality that was not surpassed by any of the compared manual ring artifact suppression methods.
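To make the starting point concrete, the sketch below illustrates the classical sinogram normalization baseline that the abstract says the proposed method automates: detector-wise (column-wise) stripe bias is estimated from the angle-averaged profile and subtracted from the sinogram. This is a minimal illustrative sketch, not the authors' algorithm; the function name and the smoothing window width are assumptions introduced here for demonstration only.

```python
import numpy as np

def normalize_sinogram(sinogram, window=9):
    """Classical sinogram normalization (illustrative baseline, not the paper's method).

    sinogram : 2D array of shape (n_angles, n_detectors)
    window   : hypothetical smoothing window width for the detector profile
    """
    # Average each detector channel over all projection angles;
    # miscalibrated channels appear as a bias in this profile and
    # produce stripes in the sinogram (rings after reconstruction).
    column_profile = sinogram.mean(axis=0)

    # Smooth the profile so slowly varying object structure is kept and
    # only the high-frequency detector miscalibration is isolated.
    kernel = np.ones(window) / window
    smoothed = np.convolve(column_profile, kernel, mode="same")

    # Subtract the estimated stripe component from every projection row.
    return sinogram - (column_profile - smoothed)[np.newaxis, :]
```

In this classical form the window width is a manual tuning parameter; the abstract's claim is that the proposed method removes such manual tuning while remaining adaptive and edge-preserving.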