Abstract

Rotors, regions of spiral wave reentry in cardiac tissue, are considered the drivers of atrial fibrillation (AF), the most common arrhythmia. Although physics-based approaches have been widely deployed to detect rotors, they typically require in-depth knowledge of cardiac physiology and skill in electrogram interpretation. The recent leap forward in smart sensing, data acquisition, and Artificial Intelligence (AI) offers an unprecedented opportunity to transform the diagnosis and treatment of cardiac disease, including AF. This study aims to develop an image-decomposition-enhanced deep learning framework for automatic identification of rotor cores in both simulation and optical mapping data. We adopt the Ensemble Empirical Mode Decomposition (EEMD) algorithm to decompose the original image, and the most representative component is then fed into a You-Only-Look-Once (YOLO) object-detection architecture for rotor detection. Simulation data from a bi-domain model and optical mapping data acquired from isolated rabbit hearts are used for training and validation. The integrated EEMD-YOLO model achieves high accuracy on both simulation and optical mapping data (precision: 97.2% and 96.8%, recall: 93.8% and 92.2%, and F1 score: 95.5% and 94.4%, respectively). The proposed EEMD-YOLO yields rotor-detection accuracy comparable to the gold standard in the literature.
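To make the pipeline concrete, the sketch below illustrates one plausible reading of the EEMD-YOLO workflow described above: decompose each pixel's time series with EEMD, keep a single representative component as an image, and pass that image to a YOLO detector. The choice of IMF index, the use of the PyEMD and Ultralytics libraries, the weights file, and all array shapes are assumptions for illustration only, not the authors' implementation.

```python
# Minimal sketch of an EEMD -> YOLO rotor-detection pipeline (assumptions noted above).
import numpy as np
from PyEMD import EEMD        # pip install EMD-signal (assumed library choice)
from ultralytics import YOLO  # pip install ultralytics (assumed YOLO implementation)

def eemd_component_image(frames: np.ndarray, imf_index: int = 1) -> np.ndarray:
    """Decompose each pixel's time series with EEMD and return the chosen IMF,
    sampled at the last frame, reshaped back into an image.

    frames: array of shape (T, H, W) holding a short optical-mapping clip.
    """
    T, H, W = frames.shape
    eemd = EEMD(trials=50)                      # number of noise realizations (assumed)
    out = np.zeros((H, W), dtype=np.float32)
    for i in range(H):
        for j in range(W):
            imfs = eemd.eemd(frames[:, i, j].astype(float))
            k = min(imf_index, imfs.shape[0] - 1)
            out[i, j] = imfs[k, -1]             # value of the chosen IMF at the last frame
    return out

def detect_rotors(component: np.ndarray, weights: str = "rotor_yolo.pt"):
    """Run a YOLO detector on the EEMD component image (weights file is hypothetical)."""
    # Scale to 8-bit and replicate to 3 channels so the detector accepts the input.
    img = ((component - component.min()) / (np.ptp(component) + 1e-8) * 255).astype(np.uint8)
    img3 = np.stack([img, img, img], axis=-1)
    model = YOLO(weights)
    results = model(img3)
    return results[0].boxes.xyxy                # bounding boxes of candidate rotor cores

# Example usage with synthetic data:
# clip = np.random.rand(64, 128, 128).astype(np.float32)
# comp = eemd_component_image(clip)
# boxes = detect_rotors(comp)
```

In practice the representative component would be selected by a criterion such as energy or correlation with the original signal rather than a fixed IMF index; the fixed index here only keeps the sketch short.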
