Abstract
In knee-joint magnetic resonance imaging (MRI), articular cartilage is susceptible to artifacts and noise, making it difficult for a single image segmentation algorithm to achieve accurate segmentation. To overcome this challenge, we propose a 3D U-Net neural network method based on prior knowledge (Prior-based 3D U-Net). First, sample data are used to train the 3D U-Net model and to compute an average shape model (ASM) of the knee joint (including the cartilage, distal femur, and proximal tibia). Second, the 3D U-Net is used to segment the distal femur and proximal tibia from newly input knee MRI data. Third, we register the newly input knee joint (the distal femur and proximal tibia) to the ASM by constructing a registration transformation RT. Fourth, RT is applied to map the ASM onto the newly input MRI images, yielding a predicted cartilage model. Finally, the predicted cartilage model, together with the distal femur and proximal tibia segmented in the previous U-Net step, is used as a constraint for the 3D U-Net to segment the knee cartilage. On the cartilage segmentation task, the Dice similarity coefficient (DSC) of the Prior-based 3D U-Net algorithm was 74.02%, the intersection-over-union (IoU) coefficient was 57.56%, the average symmetric surface distance (ASD) was 0.51 mm, the 95% Hausdorff distance (Hd95) was 3.89 mm, and the segmentation time for one MRI image was 0.38 s on average. These quantitative indicators are better than those obtained from comparable U-Net and fully convolutional network (FCN) models, indicating that our algorithm achieves more accurate automatic cartilage segmentation and greatly reduces the burden on medical researchers.
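For reference, the overlap metrics reported above (DSC and IoU) can be computed from binary segmentation masks as in the minimal NumPy sketch below. This is an illustrative example only, not the authors' evaluation code; the mask shapes and label convention (cartilage voxels labeled 1) are assumptions.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, gt: np.ndarray) -> float:
    """Dice similarity coefficient (DSC) between two binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    denom = pred.sum() + gt.sum()
    return 2.0 * intersection / denom if denom > 0 else 1.0

def iou_coefficient(pred: np.ndarray, gt: np.ndarray) -> float:
    """Intersection-over-union (IoU, Jaccard index) between two binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return intersection / union if union > 0 else 1.0

# Toy 3D volumes standing in for a predicted and a ground-truth cartilage mask.
pred_mask = np.zeros((4, 4, 4), dtype=np.uint8)
pred_mask[1:3, 1:3, 1:3] = 1
gt_mask = np.zeros((4, 4, 4), dtype=np.uint8)
gt_mask[1:3, 1:3, 2:4] = 1

print(dice_coefficient(pred_mask, gt_mask))  # 0.5
print(iou_coefficient(pred_mask, gt_mask))   # ~0.333
```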