Abstract
Most existing models for CT kernel conversion take images reconstructed with a single predetermined source kernel as input and convert them into images reconstructed with a target kernel. These models could achieve even better performance by leveraging complementary information from images reconstructed with multiple different kernels; in many clinical scenarios, however, only images reconstructed with a single kernel are available. We propose a privileged knowledge learning framework that learns knowledge of other source kernels available only in the training data (called privileged information) to guide the conversion from a single source kernel to the target kernel, via a joint prediction (JP) task. We construct an ensemble of kernel-specific (KS) tasks, where each KS network (KSNet) takes images reconstructed with a specific source kernel as input and converts them into images reconstructed with the target kernel. The JP task then provides extra regularization that helps each KSNet learn more informative feature representations for kernel conversion, such as detail and structure representations. Meanwhile, we use a cross-shaped window-based attention mechanism in the JP task to highlight the most relevant features and strengthen privileged knowledge learning, thereby alleviating both redundant noise unrelated to the target kernel and feature inconsistencies that arise across images reconstructed with different kernels. All KSNets are trained collaboratively through the JP task, which improves the performance of each individual KSNet. We extensively evaluate our method on a clinical dataset acquired with scanners from three manufacturers: Siemens, GE, and Philips. The experimental results, both quantitative and qualitative, demonstrate that our privileged knowledge learning framework effectively improves CT kernel conversion, thereby contributing to improved diagnostic accuracy and advancing comparative studies based on quantitative measurements.
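To make the architecture concrete, the following is a minimal PyTorch-style sketch of the setup the abstract describes: an ensemble of KSNets whose intermediate features feed a shared joint prediction head. All module names, layer choices, and the fusion strategy are illustrative assumptions rather than the authors' implementation, and standard multi-head attention stands in for the paper's cross-shaped window attention.

```python
# Hypothetical sketch of the privileged knowledge learning framework
# described above; not the authors' released code.
import torch
import torch.nn as nn


class KSNet(nn.Module):
    """Kernel-specific network: converts images reconstructed with one
    source kernel into images reconstructed with the target kernel."""

    def __init__(self, ch=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.decoder = nn.Conv2d(ch, 1, 3, padding=1)

    def forward(self, x):
        feat = self.encoder(x)            # kernel-specific features
        return self.decoder(feat), feat   # converted image + features for JP


class JointPrediction(nn.Module):
    """Joint prediction (JP) head: fuses features from all KSNets so that
    privileged source kernels regularize each individual KSNet."""

    def __init__(self, ch=64, heads=4):
        super().__init__()
        # Standard multi-head attention as a stand-in for the paper's
        # cross-shaped window attention.
        self.attn = nn.MultiheadAttention(ch, heads, batch_first=True)
        self.head = nn.Conv2d(ch, 1, 3, padding=1)

    def forward(self, feats):
        # feats: list of (B, C, H, W) tensors, one per source kernel
        b, c, h, w = feats[0].shape
        tokens = torch.cat(
            [f.flatten(2).transpose(1, 2) for f in feats], dim=1
        )                                             # (B, K*H*W, C)
        fused, _ = self.attn(tokens, tokens, tokens)  # highlight relevant features
        fused = fused.reshape(b, len(feats), h * w, c).mean(dim=1)
        fused = fused.transpose(1, 2).reshape(b, c, h, w)
        return self.head(fused)                       # joint target-kernel prediction


# Hypothetical training-time usage with two privileged source kernels:
ksnets = nn.ModuleList([KSNet(), KSNet()])
jp = JointPrediction()
imgs = [torch.randn(2, 1, 32, 32) for _ in ksnets]  # one batch per source kernel
outs, feats = zip(*[net(x) for net, x in zip(ksnets, imgs)])
joint = jp(list(feats))  # a loss on `joint` regularizes every KSNet
```

In the full method, cross-shaped window attention restricts this fusion to horizontal and vertical stripes rather than attending over all tokens at once, which keeps the attention cost manageable at clinical CT resolutions.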