Abstract
Currently, effective treatment options for heart failure are lacking. Methods to monitor cardiac function longitudinally can accelerate preclinical and clinical research into disease progression and novel drugs. However, manual image analysis of cardiac substructures is resource-intensive and error-prone. While automated methods exist for clinical CT images, translating these to preclinical μCT data is challenging. We aimed to employ deep learning to automatically extract quantitative data on cardiac functional parameters, developing a uniform method for both μCT and CT images. For this, we collected 110 contrast-enhanced μCT images of wild-type and accelerated-aging (Ercc1 Δ/- ) mice, as well as a public dataset of 60 segmented cardiac CT images. Substructures, i.e., the left ventricle, left myocardium, and right ventricle, were manually segmented in 24 μCT images to create a training set. After template-based heart detection, two separate segmentation neural network architectures were trained using the nnU-Net framework. Automatic and manual segmentations of the μCT training set were nearly identical (mean Dice score 0.983). The estimated median Dice score (0.940) of the test set results was comparable to existing methods for μCT image segmentation. In addition, the automatic volume metrics were similar to manual observations by an expert. In Ercc1 Δ/- mice, ejection fractions had decreased significantly and myocardial mass had increased by 24 weeks of age. The mean Dice score of the CT segmentation results (0.925 ± 0.019, n = 40) was superior to those achieved by state-of-the-art algorithms. While further optimization and validation of the models are recommended, automated data extraction expands the application of (μ)CT imaging, while reducing subjectivity and workload. In conclusion, the proposed method offers a pipeline to efficiently monitor disease progression in animals and humans, allowing uniform translation from preclinical to clinical studies.
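The Dice score used throughout the abstract to compare automatic and manual segmentations can be computed directly from two binary masks. Below is a minimal illustrative sketch (not the authors' implementation); the function name and toy arrays are assumptions for demonstration only.

```python
import numpy as np

def dice_score(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between two binary segmentation masks.

    Dice = 2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap, 0.0 none.
    """
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        # Both masks empty: conventionally treated as perfect agreement.
        return 1.0
    return 2.0 * intersection / total

# Toy example (hypothetical masks, not study data):
a = np.array([[1, 1, 0], [0, 1, 0]])
b = np.array([[1, 0, 0], [0, 1, 1]])
print(round(dice_score(a, b), 3))  # → 0.667
```

In practice the score is computed per substructure (left ventricle, left myocardium, right ventricle) on 3D volumes and then averaged.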