Abstract

Sub-optimal atrioventricular delay (AVD) is one of the main causes of non-response to cardiac resynchronization therapy (CRT). Recently, device-based algorithms (DBAs) that provide an optimal AVD based on intracardiac electrograms have been developed. However, their long-term effectiveness is still unknown. This study aims to investigate the effect of long-term AVD optimization using DBAs on the prognosis of patients undergoing CRT. A total of 118 patients who underwent CRT at our hospital between April 2008 and March 2018 were retrospectively reviewed; 61 of them, whose AVD was optimized using DBAs, were classified into the treated group (group 1), and the remaining 57 were classified into the control group (group 2). The median follow-up period was 46.0 months. The responder and survival rates in group 1 were significantly better than those in group 2 (group 1 vs. group 2: responder rate = 64% vs. 46%, p = 0.046; survival rate = 85.2% vs. 64.9%, p = 0.02). Moreover, an analysis restricted to the non-responder population showed that group 1 had an improved survival rate compared to group 2 (group 1 vs. group 2 = 72.7% vs. 45.1%, p = 0.02). In multivariate analysis, AVD optimization using DBAs was a significant contributor to the improved survival rate among CRT non-responders (HR 3.6, p = 0.01). In conclusion, long-term AVD optimization using DBAs improved the survival rate of CRT patients as well as the prognosis of CRT non-responders.

