Abstract

In medical imaging, integrating deep-learning-based semantic segmentation algorithms with preprocessing techniques can reduce the need for human annotation and advance disease classification. Among established preprocessing techniques, Contrast Limited Adaptive Histogram Equalization (CLAHE) has proven effective at improving segmentation performance across modalities such as X-ray and CT. However, there remains a need for contrast enhancement methods that account for the heterogeneity of datasets and the varying contrast of different anatomical structures. This study proposes a novel preprocessing technique, ps-KDE, and investigates its impact on deep learning algorithms that segment major organs in posterior-anterior chest X-rays. ps-KDE augments image contrast by substituting pixel values based on their normalized frequency across all images. We evaluate our approach on a U-Net architecture with a ResNet34 backbone pre-trained on ImageNet. Five separate models are trained to segment the heart, left lung, right lung, left clavicle, and right clavicle. The model trained to segment the left lung using ps-KDE achieved a Dice score of 0.780 (SD = 0.13), while the model trained on CLAHE achieved a Dice score of 0.717 (SD = 0.19), p < 0.01. ps-KDE also appears to be more robust, as the CLAHE-based left-lung model misclassified the right lung in several test images. The algorithm for performing ps-KDE is available at https://github.com/wyc79/ps-KDE. Our results suggest that ps-KDE offers advantages over current preprocessing techniques when segmenting certain lung regions, which could benefit subsequent analyses such as disease classification and risk stratification.
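
As a rough illustration of the idea described above, the sketch below pools pixel intensities from a set of training images, fits a kernel density estimate over grey levels, and remaps each pixel to the min-max-normalized density of its original value. This is a hedged reconstruction, not the authors' implementation: the bandwidth choice, normalization, sub-sampling step, and the function names fit_ps_kde_lut / apply_ps_kde are illustrative assumptions, and the definitive code is in the linked repository.

import numpy as np
from scipy.stats import gaussian_kde

def fit_ps_kde_lut(images, n_levels=256, samples_per_image=5000, seed=0):
    """Build a grey-level -> normalized-density lookup table (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    pooled = []
    for img in images:  # each img: 2-D uint8 array
        flat = img.ravel().astype(np.float64)
        take = min(samples_per_image, flat.size)  # sub-sample to keep the KDE fit tractable (assumption)
        pooled.append(rng.choice(flat, size=take, replace=False))
    pooled = np.concatenate(pooled)

    kde = gaussian_kde(pooled)            # bandwidth via Scott's rule by default (assumption)
    density = kde(np.arange(n_levels))    # density evaluated at each grey level
    # Min-max normalize so the remapped image lies in [0, 1].
    lut = (density - density.min()) / (density.max() - density.min() + 1e-12)
    return lut

def apply_ps_kde(img, lut):
    """Replace every pixel with the normalized density of its grey level."""
    return lut[img.astype(np.intp)]

A preprocessed image would then be passed to the segmentation network in place of the raw X-ray, e.g. apply_ps_kde(xray, lut).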
