Abstract

Risk stratification is an important public health priority that is central to clinical decision making and resource allocation. The aim of this study was to examine how different combinations of self-rated and objective health status predict all-cause mortality and leading causes of death in the UK. The UK Biobank study recruited > 500,000 participants between 2006 and 2010. Self-rated health was assessed using a single-item question and health status was derived from medical history, including data on 81 cancer and 443 non-cancer illnesses. Analyses included > 370,000 middle-aged and older adults with a median follow-up of 11.75 (IQR = 1.4) years, yielding 4,320,270 person-years of follow-up. Compared to individuals with excellent self-rated health and favourable health status, individuals with other combinations of self-rated and objective health status had a greater mortality risk, with hazard ratios ranging from HR = 1.22 (95% CI 1.15–1.29, P_Bonf. < 0.001) for individuals with good self-rated health and favourable health status to HR = 7.14 (95% CI 6.70–7.60, P_Bonf. < 0.001) for individuals with poor self-rated health and unfavourable health status. Our findings highlight that self-rated health captures additional health-related information and should be more widely assessed. The cross-classification between self-rated health and health status represents a straightforward metric for risk stratification, with applications to population health, clinical decision making and resource allocation.
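To make the analysis concrete, the sketch below shows how hazard ratios for a cross-classification of self-rated health and objective health status could be estimated with a Cox proportional hazards model in Python using the lifelines package. This is a minimal, hypothetical illustration on simulated data: the column names, category labels, simulated effect sizes and censoring time are assumptions for demonstration only and do not reproduce the study's actual data, models or covariate adjustments.

```python
# Minimal sketch (not the authors' code): estimating hazard ratios for the
# cross-classification of self-rated health and objective health status with
# a Cox proportional hazards model. All data below are simulated.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 5000

# Hypothetical exposure categories
self_rated = rng.choice(["excellent", "good", "fair", "poor"], size=n)
health_status = rng.choice(["favourable", "unfavourable"], size=n)

# Simulated survival times: worse categories get a higher hazard (illustrative values only)
log_hr = np.array([{"excellent": 0.0, "good": 0.2, "fair": 0.5, "poor": 1.0}[s] for s in self_rated])
log_hr += np.where(health_status == "unfavourable", 0.7, 0.0)
event_time = rng.exponential(scale=1.0 / (0.02 * np.exp(log_hr)))  # baseline hazard 0.02/year

df = pd.DataFrame({
    "follow_up_years": np.minimum(event_time, 12.0),   # administrative censoring at 12 years
    "died": (event_time <= 12.0).astype(int),
    "self_rated": self_rated,
    "health_status": health_status,
})

# Cross-classify the two measures into a single categorical exposure,
# e.g. "excellent_favourable", "poor_unfavourable", ...
df["cross_class"] = df["self_rated"] + "_" + df["health_status"]

# One-hot encode, taking excellent self-rated health + favourable status as the reference group
design = pd.get_dummies(df["cross_class"], prefix="hc", dtype=float)
design = design.drop(columns=["hc_excellent_favourable"])
model_df = pd.concat([df[["follow_up_years", "died"]], design], axis=1)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="follow_up_years", event_col="died")
print(cph.hazard_ratios_)   # HRs relative to the reference group
```

In the published analysis the models were additionally covariate-adjusted (Models 1–3 in the highlights) and p-values were Bonferroni-corrected; both steps are omitted here for brevity.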

Highlights

  • Risk stratification is an important public health priority that is central to clinical decision making and resource allocation

  • Unfavourable health status was associated with greater hazards of all-cause mortality than favourable health status, with hazard ratios (HRs) ranging from HR = 2.05 in Model 1 to HR = 1.85 in Model 3 (Supplement Table 3)

  • Compared to excellent self-rated health and favourable health status, all other levels of the full health cross-classification were associated with greater hazards, ranging from HR = 1.22 for good self-rated health and favourable health status to HR = 7.14 for poor self-rated health and unfavourable health status



Introduction

Risk stratification is an important public health priority that is central to clinical decision making and resource allocation. Self-rated health is used extensively in epidemiological and public health research, and many studies have shown that it predicts morbidity and mortality [1]. It is a single-item measure of subjective health status and likely encompasses biological, psychological, functional and socio-economic dimensions, including quality of life. A study of 1,322 community-dwelling elderly aged 60 or older who participated in the Bambuí Cohort Study of Aging in Brazil examined how well self-rated health predicted 10-year mortality, compared to a comprehensive health score derived from objective clinical measures [10]. The cross-classification between self-rated health and objective health status based on medical history or clinical measures could therefore represent a readily available metric for risk stratification, with applications to population health, clinical decision making and resource allocation.

