Abstract

Objective: Valvular heart disease (VHD) poses a significant public health burden, and choosing the best treatment strategy requires accurate assessment of heart valve function. Transthoracic echocardiography (TTE) is the key modality for evaluating VHD, but the lack of standardized quantitative measurements makes assessment subjective and time-consuming. We aimed to use deep learning to automate the extraction of mitral valve (MV) leaflets and annular hinge points from echocardiograms of the MV, improving standardization and reducing workload in the quantitative assessment of MV disease.

Methods: We annotated the MV leaflets and annulus points in 2931 images from 127 patients. We propose an approach for segmenting the annotated features using Attention UNet with deep supervision and weight scheduling of the attention coefficients to enforce saliency surrounding the MV. The derived segmentation masks were used to extract quantitative biomarkers for specific MV leaflet scallops throughout the heart cycle.

Results: Evaluation performance was summarized as Dice: 0.63 ± 0.14, annulus error: 3.64 ± 2.53 mm, and leaflet angle error: 8.7 ± 8.3°. Leveraging Attention UNet with deep supervision improved the robustness of clinically relevant metrics compared to UNet, reducing the standard deviation by 2.7° (angle error) and 0.73 mm (annulus error). Using the derived biomarkers, we correctly identified cases of MV prolapse, cases of stenosis, and healthy references in clinical material.

Conclusion: Robust deep learning segmentation and tracking of MV morphology and motion is possible by leveraging attention gates and deep supervision, and holds promise for enhancing VHD diagnosis and treatment monitoring.
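The weight scheduling of the attention coefficients mentioned in the Methods can be illustrated with a minimal sketch. The function name, the linear warm-up shape, and every parameter value below are assumptions for illustration only, not the authors' actual schedule:

```python
def attention_loss_weight(epoch: int,
                          warmup_epochs: int = 20,
                          w_min: float = 0.0,
                          w_max: float = 1.0) -> float:
    """Linearly ramp the attention-coefficient loss weight from w_min to
    w_max over the first `warmup_epochs` training epochs, then hold it
    constant. All names and default values are illustrative assumptions,
    not taken from the paper."""
    if epoch >= warmup_epochs:
        return w_max
    return w_min + (w_max - w_min) * epoch / warmup_epochs
```

A ramp of this kind lets the network settle on a coarse segmentation before the saliency-enforcing attention term carries full weight in the loss; the schedule actually used in the paper may differ.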
