This editorial refers to ‘Novel genetic markers improve measures of atrial fibrillation risk prediction’, by B.M. Everett et al., doi:10.1093/eurheartj/eht033

Atrial fibrillation (AF) is the most prevalent cardiac arrhythmia and is associated with substantial morbidity, mortality, and healthcare costs. Successful interventions to prevent AF in the general population remain limited. Therefore, clinical management of patients with AF is aimed at treating symptoms and minimizing the risk of adverse consequences such as heart failure and ischaemic stroke. Nevertheless, preventing AF remains a desirable public health objective, as it stands to reason that averting AF altogether may reduce downstream morbidity and costs attributable to the arrhythmia.

In order to prevent AF, it is necessary to have a clear understanding of the gradient of AF risk in the population. Proper AF risk stratification could theoretically enable the design of trials testing the utility of AF detection in high-risk patients, or even of specific interventions for AF prevention. Established clinical risk factors for AF include advancing age, male sex, hypertension, and heart failure.1 Over the past decade, investigators have identified a number of novel factors associated with AF, including excessive alcohol consumption, birth weight, chronic kidney disease, obesity, and various serum biomarkers, among others.1

Faced with a multitude of AF risk factors and a goal of identifying individuals at greatest risk for the arrhythmia, investigators have recently developed risk functions to predict new-onset AF.2–5 In order to enhance their applicability to the clinical setting, these risk functions have largely included traditional AF clinical risk factors. In general, AF risk functions developed thus far perform well but remain suboptimal, correctly assigning higher predicted risks to those who subsequently develop AF ∼60–80% of the time.
In addition, several novel risk factors for AF have been incorporated into existing clinical risk …