Abstract
B-mode ultrasound can be used to image musculoskeletal tissues, but one major bottleneck is the analysis of muscle architectural parameters (i.e., muscle thickness, pennation angle and fascicle length), which is most often performed manually. In this study we trained two different neural networks (a classic U-Net and a U-Net with a VGG16 pre-trained encoder) to detect muscle fascicles and aponeuroses using a set of labeled musculoskeletal ultrasound images. We determined the best-performing model based on intersection over union and loss metrics. We then compared neural network predictions on an unseen test set with those obtained via manual analysis and two existing semi-automated or automated analysis approaches (simple muscle architecture analysis [SMA] and UltraTrack). The resulting tool, DL_Track_US, detects the locations of the superficial and deep aponeuroses, as well as multiple fascicle fragments per image. For single images, DL_Track_US yielded results similar to those produced by a non-trainable automated method (SMA; mean difference in fascicle length: 5.1 mm) and human manual analysis (mean difference: -2.4 mm). Between-method differences in pennation angle were within 1.5°, and mean differences in muscle thickness were less than 1 mm. Similarly, for videos, there was overlap between the results produced with UltraTrack and DL_Track_US, with intraclass correlations ranging from 0.19 to 0.88. DL_Track_US is fully automated and open source and can estimate fascicle length, pennation angle and muscle thickness from single images or videos, as well as from multiple superficial muscles. We also provide a user interface and all necessary code and training data for custom model development.
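To illustrate the kind of architecture described above (a U-Net whose encoder is a VGG16 pre-trained on ImageNet, trained with a binary segmentation loss and evaluated with an intersection-over-union metric), the following is a minimal sketch in TensorFlow/Keras. It is not the authors' implementation; the function name `vgg16_unet`, the input size, the choice of skip-connection layers and the optimizer settings are illustrative assumptions.

```python
# Hypothetical sketch of a U-Net with a VGG16 pre-trained encoder,
# as one plausible realization of the architecture described in the abstract.
import tensorflow as tf
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import VGG16


def vgg16_unet(input_shape=(512, 512, 3)):
    """Build a binary-segmentation U-Net with an ImageNet-pre-trained VGG16 encoder."""
    encoder = VGG16(include_top=False, weights="imagenet", input_shape=input_shape)

    # Skip connections taken from the last convolution of each encoder block.
    skip_names = ["block1_conv2", "block2_conv2", "block3_conv3", "block4_conv3"]
    skips = [encoder.get_layer(name).output for name in skip_names]
    x = encoder.get_layer("block5_conv3").output  # bottleneck features

    # Decoder: upsample, concatenate with the matching encoder feature map, convolve.
    for skip, filters in zip(reversed(skips), [512, 256, 128, 64]):
        x = layers.Conv2DTranspose(filters, 2, strides=2, padding="same")(x)
        x = layers.Concatenate()([x, skip])
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)

    # One-channel sigmoid output: per-pixel probability of fascicle or aponeurosis.
    outputs = layers.Conv2D(1, 1, activation="sigmoid")(x)
    return Model(encoder.input, outputs)


model = vgg16_unet()
model.compile(
    optimizer="adam",
    loss="binary_crossentropy",
    # IoU on the foreground class, analogous to the model-selection metric named above.
    metrics=[tf.keras.metrics.BinaryIoU(target_class_ids=[1])],
)
```

In practice, two such models would be trained on separately labeled masks (one for fascicles, one for aponeuroses), and the predicted masks post-processed to extract fascicle length, pennation angle and muscle thickness.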