Abstract

This analysis is a retrospective characterization of evolving patterns in donor and recipient risk factors for early and late outcomes (survival and freedom from rejection), together with determinants of hospital and 1-year mortality, after heart transplantation over a 15-year experience at a single center. Profiles and outcomes were compared for procedures performed between 1988 and 1995 (group A, n = 105) versus 1996 and 2003 (group B, n = 218). The following parameters were considered: pretransplant diagnosis, recipient age, UNOS status, donor age, total postretrieval ischemic time, donor/recipient size match, and degree of myocardial necrosis at biopsy. Recipients in group B were significantly more compromised, as demonstrated by UNOS status (11.4% vs 19.3%; P = .05) and pretransplant pulmonary vascular resistance (2.3 ± 1.5 vs 3.1 ± 1.5; P = .04). Marginal donors were used more frequently for group B procedures (21.9% vs 47.7%; P < .0001). Outcomes were significantly more favorable among group B patients in terms of hospital mortality (18.1% vs 10.6%; P = .046) and 1- and 5-year actuarial survival (72.4% vs 83.4% and 60% vs 73.3%, respectively; P = .006). Analysis of the causes of death disclosed a significant reduction in fatal events due to graft failure and acute rejection in group B. No difference emerged with regard to actual freedom from acute rejection. Determinants of hospital mortality were pretransplant diagnosis, UNOS status, donor age, and cardioplegic solution. Transplant era, recipient age, infectious episodes, and ischemic necrosis at biopsy were risk factors for 1-year mortality. We conclude that despite extensive use of marginal donors and acceptance of higher-risk candidates, significantly better outcomes were achieved owing to improvements in global management strategies.
