Abstract

Ultrasound is widely used for image-guided therapy (IGT) in many surgical fields thanks to its advantages of portability, lack of ionizing radiation, and real-time imaging. This article presents the first attempt to apply multiple deep learning algorithms to distal humeral cartilage segmentation in dynamic, volumetric ultrasound images for minimally invasive surgery. The dataset, consisting of 5,321 ultrasound images, was collected from 12 healthy volunteers. These images were randomly split into training and validation sets in an 8:2 ratio. Nine semantic segmentation networks based on deep learning were developed and trained using our dataset at Southern University of Science and Technology Hospital in September 2022. The networks were evaluated on segmentation accuracy and processing efficiency, and were then integrated into an IGT system to assess their feasibility for three-dimensional (3D) imaging precision. In 2D segmentation, Medical Transformer (MedT) achieved the highest accuracy with a Dice score of 89.4%, although its processing efficiency was relatively low at 2.6 frames per second (FPS). In 3D imaging, the average root mean square (RMS) error between ultrasound (US)-generated models produced by the networks and magnetic resonance imaging (MRI)-generated models was no more than 1.12 mm. These findings indicate the technological feasibility of a novel method for real-time visualization of distal humeral cartilage. Improving the precision of both ultrasound calibration and segmentation is an important approach to increasing the accuracy of 3D imaging.
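The Dice score used above to rank the networks measures the overlap between a predicted segmentation mask and the ground-truth mask. As a minimal illustration (not the paper's code), it can be computed for binary masks as follows:

```python
import numpy as np

def dice_score(pred, truth):
    """Dice coefficient between two binary masks: 2|A ∩ B| / (|A| + |B|)."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    # Convention: two empty masks are a perfect match.
    return 2.0 * intersection / total if total > 0 else 1.0

# Toy 4x4 masks for illustration only
pred = np.array([[1, 1, 0, 0],
                 [1, 1, 0, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
truth = np.array([[1, 1, 0, 0],
                  [1, 0, 0, 0],
                  [0, 0, 0, 0],
                  [0, 0, 0, 0]])
print(round(dice_score(pred, truth), 3))  # 2*3 / (4+3) ≈ 0.857
```

A Dice score of 1.0 indicates perfect overlap; the paper's reported 89.4% corresponds to 0.894 on this scale.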
