Abstract
Objective: To analyze the application of deep learning-based computed tomography (CT) texture analysis in the TNM staging of gastric cancer.
Methods: GoogLeNet and AlexNet, two deep learning network models that perform well in image classification, were selected to diagnose gastric cancer pathological images. Based on the characteristics of pathological medical images, the GoogLeNet model was optimized to reduce computational cost while maintaining diagnostic accuracy. On this basis, model fusion was proposed, integrating network models with different structures and depths to learn more image features and extract more useful pathological information about gastric cancer.
Results: A deep convolutional neural network was used to automatically classify gastric cancer pathological images with the structurally distinct AlexNet and GoogLeNet models. The fused model achieved high diagnostic accuracy, with a sensitivity of 97.60% and a specificity of 99.49%, although the experimental method still relied on manually selected pathological features of gastric cancer. The AUC values of maximum frequency, skewness, and kurtosis in the venous phase were 0.735, 0.711, and 0.720, respectively (all P < 0.05).
Conclusion: The improved model combines the characteristics of both network structures and is better targeted at gastric cancer pathological sections, improving the sensitivity of gastric cancer pathological section recognition. Deep learning-based CT texture analysis can therefore effectively evaluate gastric cancer TNM staging and guide clinical treatment, which is of positive significance for diagnosing gastric cancer.
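The abstract does not specify how the AlexNet and GoogLeNet outputs are combined. The following is a minimal sketch, assuming ImageNet-pretrained backbones from torchvision with replaced classifier heads and score-level fusion by averaging the two networks' class probabilities; the two-class setup (cancerous vs. non-cancerous tissue) and the averaging strategy are assumptions, not the authors' described method.

```python
# Hypothetical sketch: score-level fusion of AlexNet and GoogLeNet
# for binary classification of gastric cancer pathology images.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 2  # assumed: cancerous vs. non-cancerous tissue


def build_models(num_classes: int = NUM_CLASSES):
    """Load ImageNet-pretrained backbones and replace their classifier heads."""
    alexnet = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)
    alexnet.classifier[6] = nn.Linear(alexnet.classifier[6].in_features, num_classes)

    googlenet = models.googlenet(weights=models.GoogLeNet_Weights.DEFAULT)
    googlenet.fc = nn.Linear(googlenet.fc.in_features, num_classes)
    return alexnet, googlenet


@torch.no_grad()
def fused_predict(alexnet, googlenet, batch):
    """Average the two networks' class probabilities (assumed fusion strategy)."""
    alexnet.eval()
    googlenet.eval()
    p_alex = torch.softmax(alexnet(batch), dim=1)
    p_goog = torch.softmax(googlenet(batch), dim=1)
    return (p_alex + p_goog) / 2  # fused probability per class


if __name__ == "__main__":
    alexnet, googlenet = build_models()
    dummy = torch.randn(4, 3, 224, 224)  # stand-in for preprocessed pathology patches
    print(fused_predict(alexnet, googlenet, dummy))
```

Likewise, the CT texture features reported in the Results (maximum frequency, skewness, kurtosis) are standard first-order histogram statistics; the sketch below shows one plausible way to compute them from a venous-phase region of interest, with the extraction details (bin count, ROI definition) assumed rather than taken from the source.

```python
# Hypothetical first-order texture features from a venous-phase CT ROI.
import numpy as np
from scipy import stats


def first_order_features(roi: np.ndarray) -> dict:
    """Compute histogram-based texture features from the ROI's voxel intensities."""
    values = roi.ravel()
    hist, bin_edges = np.histogram(values, bins=64)  # bin count is an assumption
    max_freq_intensity = bin_edges[np.argmax(hist)]  # intensity bin with maximum frequency
    return {
        "max_frequency_intensity": float(max_freq_intensity),
        "skewness": float(stats.skew(values)),
        "kurtosis": float(stats.kurtosis(values)),
    }
```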