Abstract

Ultrasound computed tomography (UCT) has attracted increasing attention due to its potential for early breast cancer diagnosis and screening. Synthetic aperture imaging is a widely used method for reflection UCT image reconstruction, owing to its ability to produce isotropic, high-resolution anatomical images. However, obtaining fully sampled UCT data from all directions over multiple transmissions is a time-consuming scanning process. Although a sparse transmission strategy can reduce acquisition time, the image quality reconstructed by traditional delay-and-sum (DAS) methods may degrade substantially. This study presents a deep learning framework based on a conditional generative adversarial network, UCT-GAN, to efficiently reconstruct reflection UCT images from sparse transmission data. Evaluation experiments using in vivo breast imaging data show that the proposed UCT-GAN can generate high-quality reflection UCT images from only 8 transmissions, comparable to those reconstructed from data acquired with 512 transmissions. Quantitative assessment in terms of peak signal-to-noise ratio (PSNR), normalised mean square error (NMSE), and structural similarity index measure (SSIM) shows that the proposed UCT-GAN efficiently reconstructs high-quality reflection UCT images from sparsely sampled transmission data, outperforming several other methods such as RED-GAN, DnCNN-GAN, and BM3D. In the 8-transmission sparse-data experiment, the PSNR is 29.52 dB and the SSIM is 0.7619. The proposed method has the potential to be integrated into UCT imaging systems for clinical use.
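The evaluation metrics named above follow standard definitions. As a minimal sketch (not the authors' code), the following NumPy implementations show how PSNR, NMSE, and a simplified global SSIM can be computed between a reference image and a reconstruction; the standard SSIM averages the same statistic over local windows, which is omitted here for brevity.

```python
import numpy as np

def psnr(ref, img, data_range=1.0):
    # Peak signal-to-noise ratio in dB, relative to the image dynamic range.
    mse = np.mean((ref - img) ** 2)
    return 10.0 * np.log10(data_range ** 2 / mse)

def nmse(ref, img):
    # Normalised mean square error: squared error energy over reference energy.
    return np.sum((ref - img) ** 2) / np.sum(ref ** 2)

def ssim_global(ref, img, data_range=1.0):
    # Simplified single-window SSIM; standard SSIM averages this over
    # local (e.g. 11x11 Gaussian-weighted) windows.
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mu_x, mu_y = ref.mean(), img.mean()
    var_x, var_y = ref.var(), img.var()
    cov = ((ref - mu_x) * (img - mu_y)).mean()
    return ((2 * mu_x * mu_y + c1) * (2 * cov + c2)) / \
           ((mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))
```

For example, adding a uniform offset of 0.1 to an image with unit dynamic range gives an MSE of 0.01 and hence a PSNR of 20 dB.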
