More than 90% of patients with left bundle branch block (LBBB) and reduced left ventricular (LV) ejection fraction have LV dyssynchrony and a high probability of response to cardiac resynchronization therapy (CRT). A subgroup of patients with non-specific intraventricular conduction delay (IVCD) have an LBBB-like LV activation pattern when studied using invasive mapping and advanced echocardiographic techniques. These patients also frequently benefit from CRT, but they have proven difficult to identify using ECG criteria. Cardiovascular magnetic resonance (CMR) imaging indices of dyssynchrony may identify patients with IVCD who may benefit from CRT, but their relative accuracies for identifying LV dyssynchrony remain unknown. We compared the LV dyssynchrony classification accuracy of two commonly available CMR indices in a study population of patients with severely reduced LV ejection fraction and no scar, and either LBBB or normal conduction (normal QRS duration and axis, controls). In LBBB (n=44) and controls (n=36), using CMR feature-tracking circumferential strain, dyssynchrony was quantified as the circumferential uniformity ratio estimate (CURE) and the systolic stretch index (SSI). Deidentified CMR image data were made publicly available. Both CURE and SSI quantified more severe dyssynchrony in LBBB compared to controls (p<0.001 for both). SSI discriminated between LBBB and normal-conduction LV activation patterns better than CURE (area under the receiver-operating characteristic curve [95% confidence interval] 0.96 [0.92-1.00] for SSI vs 0.76 [0.65-0.86] for CURE, p<0.001). SSI is superior to CURE for discriminating synchronous and dyssynchronous LV activation and should be further studied in the setting of non-LBBB conduction abnormalities.
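To illustrate how an index like CURE is derived from segmental circumferential strain, the sketch below implements one commonly published formulation (spatial Fourier analysis of strain around the short-axis circumference, with CURE ranging from 0 for fully dyssynchronous to 1 for fully synchronous contraction). This is a minimal illustration under stated assumptions, not the exact implementation used in this study; the array shapes and segment count are hypothetical.

```python
import numpy as np

def cure(strain):
    """Circumferential uniformity ratio estimate (CURE).

    strain: array of shape (n_timepoints, n_segments), circumferential
    strain sampled at segments around the LV short-axis circumference.
    Returns a value in [0, 1]: 1 = synchronous, 0 = dyssynchronous.
    One common formulation: CURE = sqrt(sum_t P0 / sum_t (P0 + P1)),
    where P0 and P1 are the zeroth- and first-order spatial Fourier
    powers of the segmental strain pattern at each time point.
    """
    strain = np.asarray(strain, dtype=float)
    f = np.fft.fft(strain, axis=1)       # spatial FFT across segments
    p0 = np.abs(f[:, 0]) ** 2            # uniform (synchronous) component
    p1 = np.abs(f[:, 1]) ** 2            # first-order (dipole) component
    return float(np.sqrt(p0.sum() / (p0 + p1).sum()))

# Hypothetical example: 30 time points, 12 segments
t = np.linspace(0.0, 1.0, 30)
curve = -0.15 * np.sin(np.pi * t)        # simple systolic strain curve
sync = np.outer(curve, np.ones(12))      # all segments contract together
dys = np.outer(curve, np.r_[np.ones(6), -np.ones(6)])  # opposing walls
```

In this toy example the synchronous pattern yields CURE of 1.0 and the opposing-wall pattern yields a value near 0, matching the expected direction of the index (lower CURE in LBBB than in controls).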