Abstract

To develop a unified deep-learning-based method for automated intracerebral haemorrhage (ICH) segmentation on computed tomography (CT) images acquired with different layer-thickness parameters. A total of 134 patients were included, drawn from an internal database (67 patients) and an external database (CQ500, 67 patients). The CT examinations covered multiple layer thicknesses, including 0.625, 1.25 and 5 mm. ICH segmentation followed a coarse-to-fine strategy with three stages: three-dimensional (3D) skull-stripping segmentation, 3D ICH localisation segmentation, and two-dimensional (2D) ICH fine segmentation. All three stages shared the same sICHNet for segmentation and employed mixed-precision training to speed up the training process. In addition, 3D contextual information from the CT volume was preserved in the 2D ICH fine segmentation by formatting consecutive slices into a three-channel image. Experimental results demonstrated that the coarse-to-fine segmentation strategy achieved the best performance, with a mean Dice coefficient of 0.887. ICH volume consistency was observed (p<0.05) between manual and automatic segmentations, and between segmentations of the same individual at different layer thicknesses, in both the internal and external databases. Automated segmentation achieved a consistent segmentation time of 20.01±2.03 seconds regardless of the layer thickness of the CT images and the extent of ICH. Longitudinal studies under conservative management and surgical treatment were also visualised. The coarse-to-fine deep-learning strategy achieved the best ICH segmentation performance on CT images, and the automated segmentation was 5-42 times faster than manual segmentation across ICH of different extents and different layer-thickness parameters.
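The three-channel slice formatting and the Dice coefficient mentioned above can be illustrated with a minimal sketch. The function names, the edge-padding choice (repeating boundary slices), and the smoothing term are illustrative assumptions, not the paper's actual implementation; the Dice coefficient itself is the standard overlap metric the abstract reports.

```python
import numpy as np

def stack_adjacent_slices(volume):
    """Format each CT slice with its neighbours into a three-channel image
    (previous, current, next), preserving 3D context for a 2D network.
    Edge slices are handled by repeating the boundary slice (an assumption).

    volume: array of shape (n_slices, H, W)
    returns: array of shape (n_slices, H, W, 3)
    """
    padded = np.concatenate([volume[:1], volume, volume[-1:]], axis=0)
    return np.stack([padded[:-2], padded[1:-1], padded[2:]], axis=-1)

def dice_coefficient(pred, target, eps=1e-7):
    """Dice = 2|A ∩ B| / (|A| + |B|) for binary masks; eps avoids 0/0."""
    pred = np.asarray(pred).astype(bool)
    target = np.asarray(target).astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)
```

For example, a 4-slice volume of shape (4, H, W) becomes a (4, H, W, 3) stack whose middle channel is the original slice, and two identical masks yield a Dice coefficient of 1.0.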

