Abstract

The automatic segmentation of the pharyngeal airway space has many potential medical uses, one of which is facilitating the creation of the Tübingen Palatal Plate. It is therefore important to understand which methods are suitable for this task. Here, neural-network-based solutions from the literature are compared to identify the best-performing methods. The models were chosen to cover a diverse landscape: some come from the general semantic segmentation literature, while others come from the medical or pharyngeal airway space segmentation literature; some are convolutional neural networks, while others are transformer-based models or hybrids of both. These models include the 2D/3D U-Net, DeepLabv3, YOLOv8, Swinv2 UNETR, SegFormer, and 3D MRU-Net. Additional strategies to enhance performance were also considered, namely training two separate networks in multiple stages and leveraging unlabeled data to pretrain the networks before fine-tuning them on the labeled data. Of all the models considered, the 2D U-Net performed best, achieving an average Dice score of 0.9180 ± 0.0111. Of the performance-enhancing strategies, only two improved the results, and only by a small margin; these strategies can therefore be considered if a small increase in performance over the 2D U-Net is desired at the expense of additional computational resources.
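The Dice score reported above is the standard overlap metric for segmentation. A minimal sketch of how it is typically computed for a pair of binary masks (function name and the epsilon smoothing term are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def dice_score(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Dice coefficient between two binary segmentation masks:
    2 * |pred AND target| / (|pred| + |target|).
    eps guards against division by zero when both masks are empty."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection) / (pred.sum() + target.sum() + eps)

# Identical masks give a score of (almost exactly) 1.0;
# disjoint masks give 0.0.
mask = np.array([[0, 1], [1, 1]])
print(round(dice_score(mask, mask), 4))
```

A score of 0.9180 therefore means the predicted airway volume overlaps the ground-truth annotation almost completely.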
