Abstract

The Cox model has been the mainstay of survival analysis in the critically ill, yet time-dependent covariates have infrequently been incorporated into such analyses. The aims of this study were to model 28-day survival of patients with acute lung injury (ALI) and acute respiratory distress syndrome (ARDS) and to compare the utility of Cox and accelerated failure time (AFT) models. We conducted a prospective cohort study of 168 adult patients enrolled at diagnosis of ALI in 21 adult ICUs in three Australian States, with survival time measured and censored at 28 days. Model performance was assessed as goodness-of-fit [GOF, cross-products of quantiles of risk and time intervals (P ≥ 0.1), Cox model] and explained variation (R², Cox and AFT). Over a 2-month study period (October-November 1999), 168 patients with ALI were identified, with a mean (SD) age of 61.5 (18) years; 30% were female. Peak mortality hazard occurred at days 7-8 after onset of ALI/ARDS. In the Cox model, increasing age and female gender, plus their interaction, were associated with an increased mortality hazard. Time-varying effects were established for patient severity-of-illness score (decreasing hazard over time) and multiple-organ-dysfunction score (increasing hazard over time). The Cox model was well specified (GOF, P > 0.34), with R² = 0.546 (95% CI: 0.390, 0.781). Both the log-normal (R² = 0.451, 95% CI: 0.321, 0.695) and log-logistic (R² = 0.470, 95% CI: 0.346, 0.714) AFT models identified the same predictors as the Cox model but did not demonstrate convincingly superior overall fit. Time dependence of predictors of survival in ALI/ARDS exists and must be appropriately modelled. The Cox model with time-varying covariates remains a flexible approach to survival analysis in patients with acute severe illness.
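
As a purely illustrative sketch of the modelling approaches summarised above (not the study's actual analysis), the following Python example uses the lifelines package to fit a Cox proportional hazards model with an age-gender interaction, check the proportional hazards assumption (a violation flags a candidate time-varying effect), and fit log-normal and log-logistic AFT models. The dataset is simulated, and the covariate names and score scales are assumptions chosen for demonstration only.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, LogNormalAFTFitter, LogLogisticAFTFitter

rng = np.random.default_rng(42)
n = 168  # cohort size reported in the abstract

# Simulated covariates loosely matching the cohort description (assumed scales)
df = pd.DataFrame({
    "age": rng.normal(61.5, 18.0, n),      # years
    "female": rng.binomial(1, 0.30, n),    # 1 = female (30% of the cohort)
    "severity": rng.normal(20.0, 7.0, n),  # severity-of-illness score (hypothetical scale)
    "mods": rng.normal(8.0, 4.0, n),       # multiple-organ-dysfunction score (hypothetical scale)
})
df["age_x_female"] = df["age"] * df["female"]  # age-gender interaction term

# Simulated survival times with a mild age effect, censored administratively at 28 days
raw_time = rng.exponential(scale=20.0 * np.exp(-0.01 * (df["age"] - 61.5)), size=n)
df["time"] = np.minimum(raw_time, 28.0)
df["died"] = (raw_time < 28.0).astype(int)  # 1 = death observed, 0 = censored at 28 days

# Cox proportional hazards model; every remaining column is treated as a covariate
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="died")
cph.print_summary()

# Proportionality check: a violation flags a candidate time-varying effect, which
# would then be modelled explicitly (e.g. covariate-by-log(time) terms in an
# episode-split dataset).
cph.check_assumptions(df, p_value_threshold=0.10)

# Accelerated failure time alternatives fitted to the same data
LogNormalAFTFitter().fit(df, duration_col="time", event_col="died").print_summary()
LogLogisticAFTFitter().fit(df, duration_col="time", event_col="died").print_summary()

In practice, confirmed time-varying effects such as those reported for the severity-of-illness and multiple-organ-dysfunction scores would be modelled explicitly, and competing models would be compared on goodness-of-fit and explained variation as described in the abstract.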
