Abstract
Background: Aortic regurgitation (AR) is a common valvular heart disease, with a relatively high prevalence of 4.9% in the Framingham Heart Study. Because prevalence increases with advancing age, an upward shift in the age distribution may increase the burden of AR. To provide an effective screening method for AR, we developed a deep learning-based artificial intelligence algorithm for the diagnosis of significant AR using electrocardiography (ECG).

Methods: Our dataset comprised 29,859 paired ECG and echocardiography records, including 412 AR cases, collected from January 2015 to December 2019, and was divided into training, validation, and test datasets. We developed a multi-input neural network model combining a two-dimensional convolutional neural network (2D-CNN) applied to raw ECG data with a fully connected deep neural network (FC-DNN) applied to derived ECG features, and compared its performance with that of a 2D-CNN model alone and of other machine learning models. In addition, we used gradient-weighted class activation mapping (Grad-CAM) to identify which parts of the ECG waveform had the greatest effect on the algorithm's decisions.

Results: The area under the receiver operating characteristic curve of the multi-input model (0.802; 95% CI, 0.762–0.837) was significantly greater than that of the 2D-CNN model alone (0.734; 95% CI, 0.679–0.783; p<0.001) and those of the other machine learning models. Grad-CAM demonstrated that the multi-input model tended to focus on the QRS complex in leads I and aVL when detecting AR.

Conclusions: The multi-input deep learning model using 12-lead ECG data could detect significant AR with modest predictive value.
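The multi-input architecture described in the Methods pairs a 2D-CNN branch over the raw 12-lead ECG with a fully connected branch over derived ECG features, merged before a single AR-probability output. The sketch below shows one way such a model could be wired up in Keras; the input shapes (12 leads x 5,000 samples, 10 scalar ECG features), layer sizes, and training settings are illustrative assumptions and are not taken from the paper.

# Minimal, hypothetical sketch of a multi-input model of the kind described
# in the abstract. Shapes and hyperparameters are placeholders, not the
# authors' settings.
import tensorflow as tf
from tensorflow.keras import layers, Model

# Assumed inputs: raw 12-lead ECG treated as a 12 x 5000 "image" with one
# channel, plus 10 scalar ECG features (e.g. intervals, axis).
waveform_in = layers.Input(shape=(12, 5000, 1), name="raw_ecg")
feature_in = layers.Input(shape=(10,), name="ecg_features")

# 2D-CNN branch over the raw waveforms
x = layers.Conv2D(16, (3, 11), activation="relu", padding="same")(waveform_in)
x = layers.MaxPooling2D((1, 4))(x)
x = layers.Conv2D(32, (3, 11), activation="relu", padding="same")(x)
x = layers.GlobalAveragePooling2D()(x)

# Fully connected branch over the derived ECG features
y = layers.Dense(32, activation="relu")(feature_in)

# Merge the two branches and predict significant AR as a binary outcome
z = layers.Concatenate()([x, y])
z = layers.Dense(32, activation="relu")(z)
out = layers.Dense(1, activation="sigmoid", name="ar_probability")(z)

model = Model(inputs=[waveform_in, feature_in], outputs=out)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["AUC"])

In this construction, the Concatenate layer is where the waveform and feature branches meet, which mirrors the design rationale reported in the abstract: the combined inputs outperformed the 2D-CNN branch used on its own.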