Abstract
Neuromuscular diseases are genetic conditions that result in a progressive loss of muscle function. One of their hallmarks is the replacement of muscle by fat tissue, which can be quantified using quantitative Magnetic Resonance Imaging (qMRI). Although individual muscles are generally affected by this replacement, the degree of fat infiltration differs from one muscle to another, so that Fat Fraction (FF) quantification in individual muscles is important and requires a delineation procedure to be performed. Given that manual delineation is tedious and time consuming, semi-automatic and automatic approaches have been developed over the last decade. More specifically, deep learning approaches have provided promising results for the automatic segmentation of medical images, with U-Net being the most widely used Convolutional Neural Network. A modified version of U-Net incorporating an "attention" block (Attention U-Net) has been proposed recently and was initially used for the automatic delineation of the pancreas on CT images. In the present work, we compared the performance of 2D U-Net and 2D Attention U-Net for i) the segmentation of individual thigh muscles on MR images from patients with neuromuscular diseases and healthy controls and ii) the quantification of FF. Our results illustrate that both Attention U-Net and U-Net provide very high Dice scores, with a significantly higher value for Attention U-Net (90% to 94.4%) than for U-Net (86% to 94.2%). Nevertheless, a statistical analysis shows that the FF estimation is not significantly impacted by the difference in Dice score between the networks. This statistical analysis also shows that Attention U-Net and U-Net both yield fat fraction estimates comparable with those computed using the segmentation masks produced by experts.
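For reference, the Dice score used to compare the predicted and expert masks, and the mean FF computed within a mask, can be expressed as follows. This is a minimal sketch assuming binary NumPy masks and a precomputed per-voxel fat-fraction map; the function names are illustrative, not from the paper's code.

```python
import numpy as np

def dice_score(pred, truth):
    """Dice similarity coefficient between two binary masks:
    Dice = 2*|A ∩ B| / (|A| + |B|), from 0 (no overlap) to 1 (identical)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * intersection / denom if denom > 0 else 1.0

def mean_fat_fraction(ff_map, mask):
    """Mean fat fraction over the voxels selected by a binary muscle mask."""
    return float(ff_map[mask.astype(bool)].mean())

# Toy example: two 4x4 masks that mostly overlap
a = np.array([[1, 1, 0, 0],
              [1, 1, 0, 0],
              [0, 0, 0, 0],
              [0, 0, 0, 0]])
b = np.array([[1, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 0],
              [0, 0, 0, 0]])
print(dice_score(a, b))  # 2*3/(4+3) ≈ 0.857
```

The paper's observation is that a modest change in Dice (e.g., 86% vs 90%) need not translate into a significant change in the mean FF, since FF is averaged over the whole mask and boundary disagreements contribute few voxels.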