Abstract

Background
Early atrial fibrillation (AF) diagnosis offers an opportunity to initiate management that reduces stroke risk. Hand-held single-lead electrocardiograph (ECG) devices are widely available, yet there are few data on implementing AF screening with these devices in high-AF-risk communities or on their accuracy over time. Such data may provide insights into the potential for artificial intelligence to scale up screening.

Purpose
To examine the accuracy of the AliveCor ECG devices' automated interpretation against clinicians' diagnoses and to explore the characteristics of ECGs from older community-dwelling people in a real-world setting.

Methods
This is an analysis of ECGs received from people aged ≥75 years participating in a 12-month implementation trial of a community AF screening program. Clinicians reviewed all ECGs transmitted to a central portal, and their interpretations were taken as the diagnostic reference. Participants and their general practitioners (GPs) were notified of AF. The devices' automatic ECG interpretations, including the participants' pulse rates automatically recorded by the devices, were extracted from the central portal and analysed using R v3.6.0.

Results
200 participants were enrolled. From May 2021 to February 2023, 30,040 ECGs were received; clinicians confirmed AF in 479 traces and no AF in 29,561 traces. The devices' automatic algorithm had a sensitivity of 91.4% (AF detected in 438 of the 479 AF traces) and a specificity of 96.6% (no AF in 28,546 of the 29,561 non-AF traces). In total, the automated interpretation labelled 1,453 ECGs as "Possible AF", of which clinicians confirmed 911 (62.7%) were in sinus rhythm and 438 (30.1%) had AF (Table 1). Comparing the devices' automatic classifications with the clinicians' diagnoses across the various rhythm categories, discrepancies were found in 5,296 ECG traces (kappa = 0.33). Of these, 4,277 (80.8%) were interpreted by the automated algorithm as "Unreadable" (interference), "Unclassified" (the device was unable to determine a result), or "Too short" (<30 seconds of recording). Excluding the 3,799 "Unclassified" traces, agreement between the ECG devices and clinicians improved (kappa = 0.63). Traces automatically reported as "Unclassified" or "Possible AF" showed wide dispersion of the participants' pulse rates (Figure 1).

Conclusions
The ECG devices had high sensitivity for detecting AF but a low level of agreement in differentiating the various rhythms. The characteristics of the longitudinal ECG rhythm traces, including the participants' pulse rates over time, coupled with the clinicians' diagnoses in this large dataset, could be used to improve the devices' automated algorithm and inform artificial intelligence strategies for large-scale community AF screening programs.
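
The headline accuracy figures follow directly from the counts reported in the Results. Below is a minimal base-R sketch (the abstract states analyses used R v3.6.0) showing that arithmetic; the object names (tp, fn, cohen_kappa, etc.) are illustrative and not taken from the study's analysis code, and the full multi-rhythm agreement table behind the kappa values of 0.33 and 0.63 is not reproduced here.

```r
# Device-detected AF vs clinician-confirmed AF among the 30,040 traces
tp <- 438                      # device flagged AF, clinician confirmed AF
fn <- 479 - 438                # clinician-confirmed AF missed by the device
tn <- 28546                    # device and clinician both reported no AF
fp <- 29561 - 28546            # device flagged AF, clinician found none

sensitivity <- tp / (tp + fn)  # 438 / 479   ~ 0.914
specificity <- tn / (tn + fp)  # 28546 / 29561 ~ 0.966

# Cohen's kappa from a square agreement table (device categories in rows,
# clinician categories in columns). The abstract does not give the full
# multi-rhythm table, so this is only a template for that comparison.
cohen_kappa <- function(tab) {
  tab <- tab / sum(tab)                     # joint proportions
  p_o <- sum(diag(tab))                     # observed agreement
  p_e <- sum(rowSums(tab) * colSums(tab))   # chance-expected agreement
  (p_o - p_e) / (1 - p_e)
}

round(c(sensitivity = sensitivity, specificity = specificity), 3)
```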
