Abstract

Deep learning is becoming increasingly popular and accessible to new users, particularly in the medical field. Deep learning image segmentation, outcome analysis, and generators rely on the presentation of Digital Imaging and Communications in Medicine (DICOM) images and, often, radiation therapy (RT) structures as masks. Although the technology to convert DICOM images and RT structures into other data types exists, no purpose-built Python module for converting NumPy arrays into RT structures exists. The two most popular deep learning libraries, TensorFlow and PyTorch, are both implemented within Python, and we believe a set of tools built in Python for manipulating DICOM images and RT structures would be useful and could save medical researchers substantial time and effort during the preprocessing and prediction steps. Our module provides intuitive methods for rapid data curation of RT-structure files by identifying unique region of interest (ROI) names and ROI structure locations and by allowing multiple ROI names to represent the same structure. It can also convert DICOM images and RT structures into NumPy arrays and SimpleITK Images, the formats most commonly used for image analysis and as inputs to deep learning architectures and radiomic feature calculations. Furthermore, the tool provides a simple method for creating a DICOM RT structure from predicted NumPy arrays, which are commonly the output of semantic segmentation deep learning models. Accessing DicomRTTool via the public GitHub project invites open collaboration, and the deployment of our module on PyPI ensures painless distribution and installation. We believe our tool will become increasingly useful as deep learning in medicine progresses.
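
The snippet below is a minimal sketch of the workflow the abstract describes: curating ROI names, converting a DICOM image series and RT structure into NumPy arrays and SimpleITK Images, and writing a predicted array back out as a new RT structure. The import path, class, method, and attribute names (DicomReaderWriter, walk_through_folders, set_contour_names_and_associations, get_images_and_mask, ArrayDicom, mask, dicom_handle, annotation_handle, prediction_array_to_RT) are assumed from the public DicomRTTool project and may differ between versions; the file paths and ROI names are placeholders.

```python
# Minimal sketch of the DicomRTTool workflow described in the abstract.
# Names below are assumed from the DicomRTTool project and may differ
# between versions; paths and ROI names are placeholders.
import numpy as np
from DicomRTTool.ReaderWriter import DicomReaderWriter

dicom_reader = DicomReaderWriter(description='liver_example', arg_max=True)
dicom_reader.walk_through_folders('/path/to/patient/dicom')  # index DICOM images and RT structures

# Data curation: list the unique ROI names found, then select the structure
# to extract; multiple ROI names can be associated with the same structure.
dicom_reader.return_rois(print_rois=True)
dicom_reader.set_contour_names_and_associations(contour_names=['liver'])

# Conversion: build NumPy arrays and SimpleITK Images for the image and mask.
dicom_reader.get_images_and_mask()
image_array = dicom_reader.ArrayDicom         # image volume as a NumPy array
mask_array = dicom_reader.mask                # ROI mask as a NumPy array
image_handle = dicom_reader.dicom_handle      # image volume as a SimpleITK Image
mask_handle = dicom_reader.annotation_handle  # ROI mask as a SimpleITK Image

# Prediction export: write a model's output back out as a DICOM RT structure.
# Here the curated mask stands in for a segmentation model's prediction; the
# expected array shape should be checked against the project's documentation.
prediction = mask_array
dicom_reader.prediction_array_to_RT(prediction_array=prediction,
                                    output_dir='/path/to/output',
                                    ROI_Names=['predicted_liver'])
```

Because TensorFlow and PyTorch both accept NumPy arrays directly, the arrays returned here could feed a segmentation model without further conversion, while the SimpleITK handles preserve spacing and origin information for radiomic feature calculations.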
