Abstract

Purpose: The role of therapeutic drug monitoring (TDM) in optimizing the use of the proliferation signal inhibitor everolimus was examined in a large heart transplant trial (N=634) comparing two doses of everolimus (1.5 and 3 mg/d) with azathioprine (AZA, 1–3 mg/kg/d), each given with CsA microemulsion and corticosteroids. Overall efficacy and intravascular ultrasound outcomes (assessing the incidence and progression of allograft vasculopathy) were significantly superior for both everolimus doses compared with AZA, while everolimus tolerability was best in the 1.5 mg group. The purpose of this investigation was to identify the minimal effective trough concentration of everolimus and to assess the future role of TDM.

Method: Cox proportional hazards regression, median-effect analysis, time-to-event analyses, and simulations were applied to characterize the exposure-efficacy relationship, estimate the minimal effective everolimus concentration, and assess the benefit of future TDM.

Results: PK/PD analyses suggested that the minimal effective trough level is 3 ng/mL. The results were robust when events up to 450 days were considered. Patients with everolimus exposure <3 ng/mL experienced biopsy-proven acute rejection (BPAR) rates of 44% (AZA: 46%); patients in the 3–8 ng/mL and ≥8 ng/mL ranges had rates of 24% and 17%, respectively. Kaplan-Meier estimates for the <3, 3–8, and ≥8 ng/mL ranges confirmed these findings: the risk of BPAR was 2.5-fold higher in the <3 ng/mL group than in the 3–8 ng/mL group (p=0.0001), while the 3–8 and ≥8 ng/mL groups carried the same risk. CsA exposure appeared to have a beneficial influence in reducing early rejections. Creatinine increase was significantly associated with CsA exposure, but not with everolimus exposure. The simulation results suggested that TDM may optimize efficacy and safety by allowing initial use of 1.5 mg/d with dose increases when the trough concentration falls below 3 ng/mL.
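The stratified time-to-event comparison described in the Results can be illustrated with a minimal Kaplan-Meier (product-limit) estimator. The sketch below is not the trial's analysis code; the follow-up times, event flags, and trough-range groupings are hypothetical placeholders used only to show how survival curves per exposure stratum would be computed.

```python
# Minimal Kaplan-Meier (product-limit) estimator for right-censored data.
# All inputs below are HYPOTHETICAL illustrative values, not trial data.

def kaplan_meier(times, events):
    """Return a list of (time, survival_probability) steps.

    times  : observed follow-up times (e.g. days to BPAR or censoring)
    events : 1 if the event (BPAR) occurred at that time, 0 if censored
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        events_at_t = 0
        n_at_t = 0
        # Group all subjects sharing the same observed time.
        while i < len(data) and data[i][0] == t:
            events_at_t += data[i][1]
            n_at_t += 1
            i += 1
        if events_at_t:
            # Product-limit update: multiply by the conditional
            # probability of surviving past time t.
            survival *= 1.0 - events_at_t / at_risk
            curve.append((t, survival))
        at_risk -= n_at_t
    return curve

# Hypothetical cohorts stratified by everolimus trough range (ng/mL).
low_trough = kaplan_meier([30, 60, 90, 120, 200, 300], [1, 1, 1, 0, 1, 0])
mid_trough = kaplan_meier([30, 60, 90, 120, 200, 300], [0, 1, 0, 0, 1, 0])
```

In the trial's actual analysis, curves like these for the <3, 3–8, and ≥8 ng/mL strata would be compared (e.g. by log-rank test or Cox regression) to quantify the elevated BPAR risk below 3 ng/mL.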
