Abstract
Accurately segmenting head and neck cancer (HNC) tumors in medical images is crucial for effective treatment planning. However, current methods for HNC segmentation are limited in their accuracy and efficiency. The present study aimed to design a model for segmenting HNC tumors in three-dimensional (3D) positron emission tomography (PET) images using Non-Local Means (NLM) and morphological operations. The proposed model was tested on the public HECKTOR challenge dataset, which included 408 patient images with HNC tumors. NLM was used to reduce image noise while preserving critical image information. Following this pre-processing step, morphological operations were applied to assess the similarity of intensity and edge information within the images. The Dice score, Intersection over Union (IoU), and accuracy were used to compare the predicted segmentations against the manual segmentations. The proposed model achieved an average Dice score of 81.47 ± 3.15, IoU of 80 ± 4.5, and accuracy of 94.03 ± 4.44, demonstrating its effectiveness in segmenting HNC tumors in PET images. The proposed algorithm produces patient-specific tumor segmentations without manual interaction, addressing the limitations of current methods for HNC segmentation. The model has the potential to improve treatment planning and aid in the development of personalized medicine. Additionally, it can be extended to segment other organs from limited annotated medical images.
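As a point of reference, the sketch below outlines how such a pipeline might look in Python with scikit-image and SciPy: NLM denoising of a 3D PET volume, a binary segmentation step, morphological cleanup, and Dice/IoU/accuracy computation against a manual mask. The Otsu threshold, the NLM parameter values, and the largest-component cleanup are illustrative assumptions, since the abstract does not specify the exact intensity/edge similarity rule or parameter settings used in the study.

```python
# Minimal sketch of the described pipeline (assumptions noted in comments).
import numpy as np
from scipy import ndimage as ndi
from skimage.restoration import denoise_nl_means, estimate_sigma
from skimage.filters import threshold_otsu


def segment_pet_volume(pet: np.ndarray) -> np.ndarray:
    """Denoise a 3D PET volume with NLM, then produce a binary tumor mask."""
    # Non-Local Means: reduce noise while preserving edge information.
    sigma = float(np.mean(estimate_sigma(pet)))
    denoised = denoise_nl_means(
        pet, patch_size=3, patch_distance=5, h=0.8 * sigma,
        sigma=sigma, fast_mode=True,
    )

    # Intensity criterion: Otsu threshold (an assumed stand-in for the
    # paper's intensity/edge similarity rule, which the abstract omits).
    mask = denoised > threshold_otsu(denoised)

    # Morphological opening/closing to remove speckle and fill small holes.
    struct = ndi.generate_binary_structure(3, 1)
    mask = ndi.binary_opening(mask, structure=struct, iterations=2)
    mask = ndi.binary_closing(mask, structure=struct, iterations=2)

    # Assumed cleanup: keep the largest connected component as the tumor.
    labels, n = ndi.label(mask, structure=struct)
    if n > 1:
        sizes = ndi.sum(mask, labels, index=range(1, n + 1))
        mask = labels == (1 + int(np.argmax(sizes)))
    return mask.astype(bool)


def evaluate(pred: np.ndarray, gt: np.ndarray) -> dict:
    """Dice, IoU, and voxel-wise accuracy between predicted and manual masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return {
        "dice": 2 * inter / (pred.sum() + gt.sum()),
        "iou": inter / union,
        "accuracy": (pred == gt).mean(),
    }
```

In practice, `segment_pet_volume` would be run per patient volume and `evaluate` called against the expert contour to reproduce the Dice, IoU, and accuracy figures reported above; the exact choices here are a hedged illustration, not the authors' implementation.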