Abstract
Objective of the study. To develop and evaluate the effectiveness of a technology for segmenting the pancreatic parenchyma and its hyper- and hypovascular lesions on abdominal computed tomography (CT) scans using deep machine learning.

Materials and methods. CT scans from the database of the A.V. Vishnevsky National Medical Research Center of Surgery, approximately 150 studies in total (arterial and venous phases), were used for training and testing the algorithms. A test dataset of 46 anonymized CT scans (arterial and venous phases), independently assessed by expert physicians, was prepared for validation of the resulting algorithms. The primary segmentation neural network was nn-UNet (M. Antonelli et al., 2022).

Results. On the test dataset, the model producing segmentation masks of the pancreas on CT images achieved an AUC of 0.8 for the venous phase and 0.85 for the arterial phase. The segmentation masks of pancreatic lesions achieved an AUC of 0.6.

Conclusion. Automated segmentation of the pancreatic parenchyma using deep machine learning demonstrated high accuracy, whereas segmentation of hypo- and hypervascular pancreatic lesions requires improvement. The overlap of the lesion masks was rather low, but in all cases the algorithm correctly identified the location of the pathological lesion. Enlarging the training dataset and refining the algorithm could increase its accuracy. No false-negative results were obtained when detecting pancreatic lesions; in all cases the neural network flagged "suspicious" areas of the pancreatic parenchyma. This can help reduce missed pancreatic pathology on CT scans, with further assessment performed by the radiologist.
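The abstract does not specify how the overlap between predicted and expert masks was scored. A common choice for comparing a predicted segmentation against a reference annotation is the Dice coefficient; the minimal sketch below shows one way such a per-case overlap could be computed for masks stored as NIfTI volumes. The file names, label value, and file format are illustrative assumptions, not details taken from the study.

```python
# Illustrative sketch: per-case overlap (Dice coefficient) between a predicted
# pancreas mask and an expert reference mask stored as NIfTI volumes.
# File names and the label value are assumptions for illustration only.
import numpy as np
import nibabel as nib

def dice_coefficient(pred_path: str, ref_path: str, label: int = 1) -> float:
    """Dice overlap between voxels equal to `label` in two spatially aligned masks."""
    pred = nib.load(pred_path).get_fdata() == label
    ref = nib.load(ref_path).get_fdata() == label
    intersection = np.logical_and(pred, ref).sum()
    denominator = pred.sum() + ref.sum()
    if denominator == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * intersection / denominator

# Example usage with hypothetical file names
print(dice_coefficient("case_001_pred.nii.gz", "case_001_expert.nii.gz"))
```

In practice such a per-case score would be averaged over the test set separately for the arterial and venous phases, mirroring how the abstract reports phase-specific results.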