Abstract

Multi-material decomposition is an important task in dual-energy CT (DECT) imaging; however, the accuracy and performance of conventional algorithms can be limited. In this work, a novel multi-material decomposition network (MMD-Net) is proposed to improve the multi-material decomposition performance of DECT imaging. MMD-Net comprises two dedicated convolutional neural network modules, Net-I and Net-II: Net-I distinguishes the material triangles, while Net-II predicts the effective attenuation coefficients corresponding to the vertices of the material triangles. Subsequently, the material-specific density maps are calculated analytically through matrix inversion. The new method is validated using in-house benchtop DECT imaging experiments with a solution phantom and a pig leg specimen, as well as commercial medical DECT imaging experiments with a human patient. The decomposition accuracy, edge spread function, and noise power spectrum are quantitatively evaluated. Compared to the conventional multi-material decomposition (MMD) algorithm, the proposed MMD-Net method is more effective at suppressing image noise; it also outperforms the iterative MMD approach in maintaining decomposition accuracy, image sharpness, and high-frequency content. Consequently, MMD-Net is capable of generating high-quality material decomposition images. In summary, a high-performance multi-material decomposition network is developed for dual-energy CT imaging.
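As a concrete illustration of the analytic step, the sketch below solves the standard three-material system for one pixel under the common volume-conservation formulation of MMD: the measured low- and high-energy attenuation coefficients are expressed as a convex combination of the triangle's vertex coefficients (which, in the paper, Net-II would predict). The vertex values, function name, and variables are illustrative assumptions, not taken from the paper.

import numpy as np

def decompose_pixel(mu_low, mu_high, vertices):
    # vertices: 3x2 array; row i holds (mu_low_i, mu_high_i) for the
    # i-th vertex of the selected material triangle.
    # Build the 3x3 system enforcing the two measured attenuation
    # values and the volume-conservation constraint f1 + f2 + f3 = 1:
    #   [ mu_low_1   mu_low_2   mu_low_3  ] [f1]   [ mu_low  ]
    #   [ mu_high_1  mu_high_2  mu_high_3 ] [f2] = [ mu_high ]
    #   [     1          1          1     ] [f3]   [    1    ]
    A = np.vstack([vertices.T, np.ones(3)])
    b = np.array([mu_low, mu_high, 1.0])
    return np.linalg.solve(A, b)  # material fractions (f1, f2, f3)

# Hypothetical water/iodine/air triangle (values illustrative only)
vertices = np.array([[0.20, 0.18],   # water   (mu_low, mu_high)
                     [0.90, 0.45],   # iodine solution
                     [0.00, 0.00]])  # air
print(decompose_pixel(0.32, 0.216, vertices))  # -> [0.7, 0.2, 0.1]

In this formulation the per-pixel inversion is exact and fast; the difficulty, which MMD-Net addresses with its two network modules, lies in choosing the correct triangle and its vertex coefficients for each pixel.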
