Abstract

In this work, a Semantic-Preserved Generative Adversarial Network optimized by Piranha Foraging Optimization for Thyroid Nodule Classification in Ultrasound Images (SPGAN-PFO-TNC-UI) is proposed. First, ultrasound images are gathered from the DDTI dataset. The input image is then passed to the pre-processing stage, where a Multi-Window Savitzky-Golay Filter (MWSGF) is employed to reduce noise and improve the quality of the ultrasound (US) images. The pre-processed output is supplied to Generalized Intuitionistic Fuzzy C-Means Clustering (GIFCMC), which segments the Region of Interest (ROI) of the ultrasound image. The segmentation output is supplied to the Fully Numerical Laplace Transform (FNLT) to extract features: geometric features, such as solidity, orientation, roundness, major axis length, minor axis length, bounding box, and convex area, and morphological features, such as area, perimeter, aspect ratio, and AP ratio. The Semantic-Preserved Generative Adversarial Network (SPGAN) then classifies each image as containing a benign or malignant nodule. By itself, SPGAN provides no optimization mechanism for determining the best parameters to ensure accurate classification of thyroid nodules; therefore, the Piranha Foraging Optimization (PFO) algorithm is applied to tune the SPGAN classifier and accurately identify thyroid nodules. Metrics including F-score, accuracy, error rate, precision, sensitivity, specificity, ROC, and computing time are examined.
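The abstract does not give the MWSGF formulation. As a hedged sketch, one simple reading of "multi-window" is to average standard Savitzky-Golay outputs over several window lengths via `scipy.signal.savgol_filter`; the window sizes, polynomial order, and row-wise application below are illustrative assumptions, not the paper's specification:

```python
import numpy as np
from scipy.signal import savgol_filter


def multi_window_savgol(image, windows=(5, 9, 13), polyorder=2):
    """Average Savitzky-Golay smoothings over several window lengths.

    Illustrative sketch only: the window lengths, polynomial order, and
    the row-wise (axis=1) application are assumptions, not the MWSGF spec.
    """
    smoothed = [savgol_filter(image, w, polyorder, axis=1) for w in windows]
    return np.mean(smoothed, axis=0)


# Toy example: a noisy sinusoidal "image" standing in for an ultrasound frame.
rng = np.random.default_rng(0)
clean = np.tile(np.sin(np.linspace(0, 4 * np.pi, 64)), (8, 1))
image = clean + 0.2 * rng.standard_normal((8, 64))
denoised = multi_window_savgol(image)
```

Averaging several window lengths trades off the noise suppression of wide windows against the edge preservation of narrow ones, which is one plausible motivation for a multi-window variant.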
The proposed SPGAN-PFO-TNC-UI method attains 30.54%, 21.30%, 27.40%, and 18.92% higher precision and 26.97%, 20.41%, 15.09%, and 18.27% lower error rates than existing techniques: thyroid detection and classification using a DNN with a hybrid meta-heuristic and LSTM (TD-DL-HMH-LSTM), quantum-inspired convolutional neural networks for optimized thyroid nodule categorization (QCNN-OTNC), thyroid nodule classification under Follow-the-Regularized-Leader optimization based deep neural networks (CTN-FRL-DNN), and automatic classification of ultrasound thyroid images using vision transformers and generative adversarial networks (ACUTI-VT-GAN), respectively.
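The geometric and morphological descriptors named above (solidity, orientation, axis lengths, area, perimeter, aspect ratio, roundness) are standard region properties. A hedged sketch using `skimage.measure.regionprops` on a hypothetical binary ROI mask follows; the rectangular stand-in mask and the roundness formula 4πA/P² are illustrative assumptions, not the paper's FNLT-based extractor:

```python
import numpy as np
from skimage.measure import label, regionprops

# Hypothetical binary segmentation mask standing in for the clustered ROI:
# a filled rectangle plays the role of a segmented nodule.
mask = np.zeros((64, 64), dtype=np.uint8)
mask[20:44, 16:52] = 1

props = regionprops(label(mask))[0]
features = {
    "area": props.area,
    "perimeter": props.perimeter,
    "solidity": props.solidity,                 # area / convex hull area
    "orientation": props.orientation,
    "major_axis_length": props.major_axis_length,
    "minor_axis_length": props.minor_axis_length,
    "aspect_ratio": props.major_axis_length / props.minor_axis_length,
    # Roundness as 4*pi*A/P^2: 1 for a circle, smaller for elongated shapes.
    "roundness": 4 * np.pi * props.area / props.perimeter ** 2,
}
```

For the rectangular stand-in, solidity is 1 (the region equals its convex hull) and the aspect ratio is close to the 36:24 side ratio, which is a quick sanity check on any such feature extractor.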
