Abstract

Microwave-induced thermoacoustic tomography (TAT) is a rapidly developing noninvasive imaging technique that integrates the advantages of microwave imaging and ultrasound imaging. Although the image reconstruction algorithm is critical for TAT, current reconstruction methods often create significant artifacts and are computationally costly. In this work, we propose a deep learning-based end-to-end image reconstruction method that reconstructs the initial pressure density image directly from the sinogram data. We design a new network architecture, TAT-Net, to map the sinogram domain to the image domain with high accuracy. For scenarios where realistic training data are scarce or unavailable, we use the finite element method (FEM) to generate synthetic data, and we resolve the domain gap between the synthetic and realistic data through signal processing. The TAT-Net trained with synthetic data is evaluated through both simulations and phantom experiments and achieves competitive performance in artifact removal and robustness. Compared with other state-of-the-art reconstruction methods, TAT-Net reduces the root mean square error to 0.0143 and increases the structural similarity and peak signal-to-noise ratio to 0.988 and 38.64, respectively. These results indicate that TAT-Net has great potential for improving image reconstruction quality and enabling fast quantitative reconstruction.
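The abstract reports reconstruction quality using three standard image metrics: root mean square error (RMSE), structural similarity (SSIM), and peak signal-to-noise ratio (PSNR). As a hedged illustration (not the authors' evaluation code), the sketch below computes these metrics with NumPy for images normalized to [0, 1]; the SSIM here is a simplified single-window (global) variant rather than the usual sliding-window SSIM.

```python
import numpy as np

def rmse(ref, est):
    """Root mean square error between a reference and an estimate."""
    return float(np.sqrt(np.mean((ref - est) ** 2)))

def psnr(ref, est, data_range=1.0):
    """Peak signal-to-noise ratio in dB, assuming a known data range."""
    mse = np.mean((ref - est) ** 2)
    return float(10.0 * np.log10(data_range ** 2 / mse))

def ssim_global(ref, est, data_range=1.0):
    """Simplified SSIM computed over the whole image as one window.

    Standard implementations (e.g. skimage) use a local sliding window;
    this global variant is only a rough illustration of the formula.
    """
    c1 = (0.01 * data_range) ** 2  # stabilizing constants from the SSIM paper
    c2 = (0.03 * data_range) ** 2
    mu_x, mu_y = ref.mean(), est.mean()
    var_x, var_y = ref.var(), est.var()
    cov = ((ref - mu_x) * (est - mu_y)).mean()
    return float(((2 * mu_x * mu_y + c1) * (2 * cov + c2))
                 / ((mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)))

# Example: a reference image and an estimate offset by a constant 0.01
ref = np.linspace(0.0, 1.0, 64).reshape(8, 8)
est = ref + 0.01
print(rmse(ref, est))   # 0.01
print(psnr(ref, est))   # 40.0 dB
```

Lower RMSE and higher SSIM/PSNR indicate better agreement with the ground-truth image, which is the direction of improvement the abstract claims for TAT-Net.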
