Anemia-independent prognostic value of iron deficiency in incident peritoneal dialysis patients.
Background and objectives: Iron plays a critical role beyond erythropoiesis, yet the prognostic significance of iron deficiency (ID) independent of anemia remains poorly defined in the peritoneal dialysis (PD) population. This study aimed to evaluate the association between iron status, specifically transferrin saturation (TSAT), and mortality in PD patients, independent of hemoglobin levels.
Design, setting, participants, and measurements: We conducted a retrospective cohort study of 11,013 adults who initiated PD at a large US dialysis network between December 2004 and January 2011. Patients had at least 180 days on PD and baseline data on TSAT, ferritin, hemoglobin, albumin, and white blood cell count. The primary outcome was all-cause mortality. Associations between iron parameters and mortality were assessed using Cox proportional hazards models and restricted cubic splines, with adjustment for demographic, clinical, treatment-related, and laboratory variables, including hemoglobin and ESA use.
Results: Iron deficiency, defined as TSAT ≤20%, was present in 10% of patients at PD initiation. The cohort was 54% male and 70% Caucasian, with a mean age of 55 years; 39% had diabetes. While 91% received erythropoiesis-stimulating agents, only 34% received IV iron. During follow-up (median 440 days), 2704 deaths occurred (24.6% of the cohort). After comprehensive adjustment, TSAT ≤20% remained independently associated with increased mortality (adjusted HR: 1.26; 95% CI: 1.12-1.42). Spline analyses showed a sharp rise in mortality risk at TSAT levels below 25%. Ferritin was inconsistently associated with mortality risk.
Conclusions: Iron deficiency is common in incident PD patients and is associated with increased mortality risk, independent of anemia. These findings challenge current anemia-centric treatment paradigms and suggest that iron status, particularly TSAT, should be routinely assessed in PD patients regardless of hemoglobin levels. A prospective, randomized trial is warranted to evaluate whether proactive iron management improves outcomes in this population.
- Research Article
- 10.1093/ndt/gfaf116.088
- Oct 21, 2025
- Nephrology Dialysis Transplantation
Background and Aims: Iron deficiency is an anemia-independent mortality risk in people with chronic kidney disease who do not require dialysis. Less is known about this association in dialysis-dependent kidney failure, particularly in peritoneal dialysis (PD). Transferrin saturation (TSAT) and ferritin are commonly assessed in dialysis care to evaluate iron stores. This study investigated the risk of death in incident PD patients based on iron stores and IV iron therapy.
Method: We evaluated data from incident adult PD patients treated between 2004 and 2011 at a large US dialysis provider network. Patients were included if they were on PD for ≥180 days, had no modality change for ≥90 days, and had at least one measurement of TSAT, ferritin, hemoglobin, and albumin during the 180-day baseline period after initiation of PD. Adjusted Cox proportional hazards models with smoothing splines were used to investigate the hazard ratio for death across continuous values of TSAT and ferritin. Analyses were stratified by IV iron administration status during the baseline period. Adjustments were made for demographic characteristics (age, sex, race, dialysis vintage, and history of hemodialysis before PD start), comorbidities (gastrointestinal bleeding, peripheral vascular disease, hypertension, cardiovascular disease, cerebrovascular disease, heart failure, acute myocardial infarction, diabetes mellitus, and cancer), laboratory values (albumin, hemoglobin, and white blood cell [WBC] count), and erythropoiesis-stimulating agent (ESA) usage.
Results: In the final cohort of 11,013 patients (mean age 55 years, 54% male, 70% Caucasian, mean albumin 3.6 g/dL, TSAT 31%, and ferritin 445 ng/mL), 34% received IV iron therapy during the 180-day baseline period after starting PD. The majority (90.5%) of the cohort received an ESA dose during baseline. Patients with TSAT levels <25% were observed to have higher risks of death compared with those at higher levels (Fig. 1).
This pattern was consistent in the subgroup of patients who did not receive IV iron and was independent of hemoglobin levels. By contrast, the impact of low TSAT on mortality was attenuated in patients who received IV iron. High ferritin was associated with an increased risk of death among those who did not receive any IV iron, whereas no such risk was seen in those who received IV iron.
Conclusion: Our analysis revealed that iron deficiency is associated with mortality risk among patients starting PD treatment. This effect was independent of hemoglobin and restricted to those not treated with IV iron. High ferritin levels were associated with an increased mortality risk in those who did not receive IV iron; there was no association between ferritin and mortality risk in patients who received IV iron. These findings suggest that ferritin plays a dual role, as an indicator of both functional iron deficiency and inflammation-driven mortality risk, with potential implications for managing iron therapy in PD patients.
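The smoothing-spline approach used in this study models the mortality hazard as a flexible function of TSAT rather than forcing a straight line. As an illustrative sketch only (not the authors' code), a restricted cubic spline basis in Harrell's formulation can be built in a few lines of pure Python; the knot locations below are hypothetical, and in practice the resulting columns would enter a Cox model alongside the other covariates:

```python
def rcs_basis(x, knots):
    """Restricted cubic spline basis (Harrell's formulation) at a single point x.

    Returns [x, X_1, ..., X_{k-2}] for k knots. These columns, fitted as
    covariates in a Cox model, let the exposure-hazard curve bend at the knots
    while remaining linear beyond the boundary knots.
    """
    t = sorted(knots)
    k = len(t)
    norm = (t[-1] - t[0]) ** 2  # conventional normalization factor

    def pos3(u):  # truncated cubic: (u)_+^3
        return u ** 3 if u > 0 else 0.0

    cols = [x]
    for j in range(k - 2):
        term = (pos3(x - t[j])
                - pos3(x - t[-2]) * (t[-1] - t[j]) / (t[-1] - t[-2])
                + pos3(x - t[-1]) * (t[-2] - t[j]) / (t[-1] - t[-2]))
        cols.append(term / norm)
    return cols
```

By construction the basis is exactly linear beyond the boundary knots, so extreme TSAT or ferritin values cannot produce runaway cubic tails; statistical packages (e.g. R's rms) generate the same basis automatically.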
- Discussion
- 10.3747/pdi.2014.00031
- Jun 1, 2014
- Peritoneal Dialysis International: Journal of the International Society for Peritoneal Dialysis
Extremes of body mass index and mortality among Asian peritoneal dialysis patients.
- Research Article
- 10.1371/journal.pone.0147070
- Jan 19, 2016
- PLOS ONE
Background: Proper monitoring for volume overload is important to improve prognosis in peritoneal dialysis (PD) patients. The association between volume status and residual renal function (RRF) remains an unresolved issue. The aim of the present study was to evaluate the association between the edema index and survival or RRF in incident PD patients.
Patients and Methods: We identified all adults who underwent PD. The edema index was defined as the ratio of extracellular fluid to total body fluid. Participants with available data on survival status during the first year after PD initiation were included in the area under the receiver operating characteristic curve analysis. The cutoff value of the edema index for 1-year mortality was >0.371 in men and >0.372 in women. Participants were divided into two groups according to their baseline edema indices: High (>cutoff value) and Low (≤cutoff value). Survivors of the first year after PD initiation were divided into two groups according to the initial and 1-year edema index: Non-improvement (maintenance of the initial Low-group criteria throughout the year) and Other (all participants except those in the Non-improvement group).
Results: In total, 631 patients were enrolled in the present study. The mean initial RRF values (mL/min/1.73 m2) in the Low and High groups were 4.88 ± 4.09 and 4.21 ± 3.28 in men (P = 0.108), and 3.19 ± 2.57 and 2.98 ± 2.70 in women (P = 0.531), with no significant differences between groups in either sex. The mean RRF values at 1 year after PD initiation in the Low and High groups were 3.56 ± 4.35 and 2.73 ± 2.53 in men, and 2.80 ± 2.36 and 1.85 ± 1.51 in women; RRF at 1 year was higher in the Low group than in the High group (men: P = 0.027; women: P = 0.001). In men, the cumulative 5-year survival rates were 78.7% and 46.2% in the Low and High groups, respectively; in women, the rates were 77.2% and 58.8%, respectively. Among survivors of the first year after PD initiation, the Non-improvement group had a poorer survival rate than the Other group in both sexes.
Conclusion: A high edema index was associated with mortality in incident PD patients at baseline and follow-up. The edema index may be used as a new marker for predicting mortality in PD patients.
- Research Article
- 10.3390/nu14010144
- Dec 29, 2021
- Nutrients
This study evaluated the association of the serum total cholesterol to high-density lipoprotein cholesterol ratio (TC/HDL-C) with mortality in incident peritoneal dialysis (PD) patients. We performed a multi-center, prospective cohort study of 630 incident PD patients from 2008 to 2015 in Korea. Participants were stratified into quintiles according to baseline TC, HDL-C, LDL-C, and TC/HDL-C. The association between mortality and each lipid measure was evaluated using multivariate Cox regression analysis. During a mean follow-up period of 70.3 ± 25.2 months, 185 deaths were recorded. The highest TC/HDL-C group had the highest body mass index, prevalence of diabetes, and serum albumin level. Multivariate analysis demonstrated that the highest quintile of TC/HDL-C was associated with an increased risk of all-cause mortality (hazard ratio 1.69, 95% confidence interval 1.04–2.76; p = 0.036), whereas TC, HDL-C, and LDL-C were not associated with mortality. Linear regression analysis showed a positive correlation between TC/HDL-C and body mass index. Increased serum TC/HDL-C was an independent risk factor for mortality in the subgroups of older age, female sex, cardiovascular disease, and low HDL-C. Neither TC nor HDL-C alone predicted mortality in PD patients; however, increased serum TC/HDL-C was independently associated with all-cause mortality in PD patients.
- Research Article
- 10.7759/cureus.19728
- Nov 18, 2021
- Cureus
Objective: White blood cell (WBC) count has been used as a predictor in research because it is a prognostic indicator and a substantial predictor of the development of cardiovascular disease (CVD). Very few reports have examined the association between WBC count and overall mortality in peritoneal dialysis (PD) patients. We aimed to explore whether the baseline total leukocyte count is linked to all-cause mortality in PD patients, assessing the association for linearity.
Material and methods: The study comprised 204 incident PD patients who began treatment at the Nephrology Department of Health Sciences University, Kayseri Medical Faculty, Kayseri City Hospital between January 2009 and December 2017. The research period ended in January 2018. The link between baseline WBC count and all-cause mortality was studied using Cox proportional hazards models.
Results: The average age of the patients was 46.75 (8.49) years, and 48.5% were male. Diabetes and hypertension were prevalent in 59.8% and 76% of the population, respectively. The average WBC count was 9.37 (2.70) × 10³/µL. In the crude model, the mortality risk increased by 23% for every one-unit increase in WBC count. In the fully adjusted model, the hazard of death was 1.12 [95% confidence interval (CI): 1.02-1.23, p = 0.015]. In the models with WBC count stratified by tertiles, the fully adjusted mortality hazard was 2.38 (95% CI: 1.24-4.58, p = 0.009) for patients in tertile 2 and 2.64 (95% CI: 1.30-5.33, p = 0.007) for patients in tertile 3, compared with patients in tertile 1.
Conclusion: The initial WBC count may have a long-term impact on patient survival. Individuals with higher baseline values, or an elevation during follow-up, should therefore be closely monitored, and all preventive measures should be taken to lower their risk.
- Research Article
- 10.1002/dat.20384
- Nov 1, 2009
- Dialysis & Transplantation
Citation: Liao C-T, Chen Y-M, Shiao C-C, et al. Rate of decline of residual renal function is associated with all-cause mortality and technique failure in patients on long-term peritoneal dialysis. Nephrol Dial Transplant. 2009;24:2909–2914. Analysis: Numerous investigators have evaluated the impact of treatment choices on the preservation of residual renal function (RRF) and whether patients with higher levels of RRF demonstrate improved patient-centered outcomes. Baseline RRF at the initiation of peritoneal dialysis (PD) is associated with improved overall patient survival and technique survival.1, 2 In the current study, Liao and colleagues provided evidence that the rate of decline of RRF from initiation of PD, rather than simply the level of RRF at baseline, may be a stronger predictor of patient and technique survival. In this retrospective cohort, Liao and coworkers analyzed data on 270 PD patients followed for an average of 45 months, evaluating their rate of RRF decline and determining the variables likely contributing to the more accelerated decline seen in some patients. They demonstrated that patients who had diabetes mellitus, had a history of congestive heart failure (CHF), were obese, used diuretics, or experienced either peritonitis episodes or hypotensive events showed a faster rate of RRF decline. Survival analysis correcting for comorbid conditions demonstrated that patients with the greatest decline in RRF had poorer overall patient and technique survival. Validity and threats to validity: Cohort studies have the benefit of evaluating a diverse group of patients over a long period of time and examining multiple outcomes that result from similar potentially harmful exposures. Furthermore, cohort studies can be used to examine questions regarding harm where randomized controlled trials would typically be either unethical or impractical.
These types of studies, however, can be prone to both selection bias (systematic errors in group selection, causing groups to differ with respect to prognosis independent of the exposure or intervention of interest) and confounding (measured or unmeasured factors or interventions that affect the outcome of interest but that are not part of the causal pathway), and careful appraisal of data must be employed to ensure accurate interpretations3 and to identify true effect modifiers. This type of study design may be the most practical way to evaluate the question posed by Liao and coworkers, that is, whether the rate of RRF decline from any cause predicts increased risk of early mortality and technique failure in patients with end-stage renal disease (ESRD) treated with peritoneal dialysis. This study evaluates a large population of patients with strict inclusion criteria (thus a well-defined cohort) and describes the method the investigators employed to eliminate selection bias through restriction. By excluding patients who transferred out of peritoneal dialysis within 6 months, who had a prior transplant, or who were anuric at enrollment, the authors helped reduce prognostic differences that might have affected outcomes. Although this reduces the sample size and power of the study and limits applicability to certain types of patients, it also limits the differences in related baseline characteristics3-5 that might affect outcomes separately from the characteristic or exposure under study. Furthermore, the restrictions meet the test of being biologically sound, since, for instance, it is unlikely that one can evaluate rates of decline in RRF in any patient who does not have residual function at baseline (anuric patients).
The study also employs regression analysis and stratification of data, two techniques used to minimize or to expose potential confounders.5 Patients were divided into tertiles based on their rate of renal function decline, analyzed for group differences, and then evaluated using a regression analysis to determine statistically significant factors associated with patient survival and technique survival. These robust statistical analyses are described in detail. The results obtained are plausible and consistent with other epidemiologic and experimental studies. Diabetes, CHF, and older age are well-known risk factors for overall mortality in PD patients, and peritonitis, with its systemic inflammation and potential exposure to nephrotoxic antibiotic therapy, likely contributes to decline in residual renal function. Obesity in patients with low residual function may contribute to problems with middle molecule clearance and eventual modality failure, and hypotensive episodes may cause ischemic injury to the kidneys. The authors addressed issues related to study design limitations and the potential limitations in applicability of a single center cohort. Clinical bottom line: This well-designed, methodologically rigorous retrospective cohort study provides evidence supporting the importance of RRF to patient outcomes, and in particular, that the rate of decline of RRF is a stronger predictor of overall mortality and modality failure than the level of baseline RRF itself. Factors such as diabetes, obesity, CHF, diuretic use, peritonitis, and hypotensive episodes are independent prognostic contributors to the rate of RRF decline. 
Given the new data reported in this study, and in conjunction with the findings from previous studies, it appears prudent to intensify efforts to preserve residual function through routine measures of care: aggressively managing volume and nutritional status, preventing hypovolemia and hypotension, judiciously monitoring for optimal glucose control in diabetes mellitus, and preventing peritonitis. Whether specific novel interventions, such as modulation of the PD fluid composition, will result in both preservation of RRF and improved survival awaits confirmation in a randomized controlled trial. The study by Kim and colleagues below begins to examine the question of novel approaches for the preservation of RRF. Citation: Kim S, Oh J, Kim S, et al. Benefits of biocompatible PD fluid for preservation of residual renal function in incident CAPD patients: a 1-year study. Nephrol Dial Transplant. 2009;24:2899–2908. Analysis: Kim and colleagues performed a randomized controlled clinical trial to determine whether the use of a peritoneal dialysis fluid of neutral pH and low in glucose degradation products (relative to standard PD fluid) would result in better preservation of RRF in incident ESRD patients treated with peritoneal dialysis. This study was performed because previous studies examining this question were inconclusive owing to limited power6 or technique/population heterogeneity.7 The investigators reasoned that improved biocompatibility of the PD fluid would result in decreased inflammation, which in turn would result in improved preservation of RRF, improved technique survival, and reduced mortality. The hypothesis tested by Kim and colleagues in this study arises from the following reasoning. In 1995, Maiorca and colleagues described an independent relationship between the presence of RRF and survival of dialysis patients.
Their multivariate survival analysis of 102 dialysis patients (both hemodialysis [HD] and PD) demonstrated that for each 1 mL/min increase in glomerular filtration rate (GFR) of RRF, mortality rates decreased by 40% in the entire group and 50% in PD patients.8 Similar results were shown in the landmark CANUSA (Canadian/USA Peritoneal Dialysis Study Group) study of 680 incident PD patients.9 Other factors that might impact RRF in incident dialysis patients include dialysis modality,10, 11 strict blood-pressure control,12 renin-angiotensin-aldosterone blockade,5 and avoidance of nephrotoxins.13, 14 Animal studies have shown that exposure to traditional PD fluids, which have a “high” concentration of glucose degradation products (GDP), results in increased serum levels of advanced glycation end products and progressive renal injury due to glomerulosclerosis.15 In vitro biocompatibility studies have shown that a PD fluid that is both acidic and hyperosmolar inhibits leukocyte phagocytosis and reduces bactericidal activity.16 Data on the effects of PD fluid composition on patient-centered outcomes are incomplete. The current study by Kim and colleagues attempts to correct this deficiency. Their study is a multicenter, open-label, randomized, prospective trial comparing outcomes in incident PD patients treated with low-GDP solutions versus standard PD solutions. Patients were followed for a total of 12 months, and GFR was measured as the mean of renal urea and creatinine clearances. The study protocol was rigorously designed to avoid major common threats to validity, including selection bias, differential loss to follow-up, and ascertainment bias. Since the study was unmasked, the probability of the latter was reduced by protocolized determination of hard outcomes, including measured GFR, survival, and PD membrane transport properties.
After masked allocation, as expected, there were no significant differences in age, gender, weight, or etiologies of ESRD between the two treatment arms of the study. The study was powered to see a difference in the primary outcome of changes in or preservation of RRF. Data were analyzed according to intention to treat, and these results were compared with additional post hoc analyses of “per-protocol” results and of the subgroup of study patients with baseline residual function of >2 mL/min/1.73 m2. Patients in both arms of the study were treated identically in all aspects of care other than the PD fluid choice. For the primary outcome of the study, that is, preservation of RRF, there was no statistically significant benefit from use of a low-GDP solution in the intention-to-treat analysis. However, in a post hoc analysis including only patients with a baseline GFR of >2 mL/min, use of a low-GDP PD solution resulted in a statistically significant improvement in RRF preservation. The authors argue that such an analysis is indicated, since one would expect that only those patients with some baseline residual function would likely benefit from an intervention intended to improve the preservation of RRF. Study patients treated in the low-GDP PD fluid arm of the trial maintained higher serum bicarbonate levels. No statistically significant differences in patient, technique, or peritonitis-free survival were found between the two groups, but the event rates for these important patient-centered outcomes were low, and the study was therefore far from sufficiently powered to address these important issues. A “trend” towards an increase in the rate of peritonitis in the low-GDP PD solution group compared with the standard solution group was not statistically significant. Applicability to U.S.
population: The population chosen for this study was heterogeneous in nature and included a diverse adult ESRD population in terms of age, gender, comorbidities, and etiology of ESRD. All patients, however, were culturally and ethnically similar and residents of Taiwan. Furthermore, the study excluded patients on automated PD, a common modality in the United States. Hence the results should be applied to the patient population of North America with some caution, as the impact of ethnicity and culture, as well as the organizational structure of healthcare, on PD outcomes cannot be determined. Clinical bottom line: The trial by Kim and coworkers provides some support for the hypothesis that low-GDP PD fluid may result in better preservation of RRF in incident PD patients on continuous ambulatory PD during their first year on dialysis. The study was a well-designed, randomized controlled trial with masked allocation but unblinded follow-up. The strength of the conclusions arising from this study is attenuated by: (1) insufficient power to examine any of the secondary endpoints, including, importantly, risk for mortality, peritonitis, and technique failure; (2) inclusion of patients without RRF in a trial where the primary outcome of interest was preservation of RRF, thus necessitating the as-treated and subgroup analyses; and (3) a relatively brief period of observation. Thus, further randomized trials with a larger number of patients should be conducted to validate these results, with sufficient power to address the patient-centered morbidity and mortality outcomes. Pending further RCT evidence, a systematic review with meta-analysis of current RCTs, assuming the degree of between-study heterogeneity is not too large, might provide preliminary evidence of potential benefit or harm with regard to the patient-centered outcomes of mortality, peritonitis rates, and technique survival. Citation: Chan KE, Lazarus JM, Wingard RL, Hakim RM.
Association between repeat hospitalization and early intervention in dialysis patients following hospital discharge. Kidney Intl. 2009;76:331–341. Analysis: Chan and collaborators have exploited the clinical database of a large dialysis organization (LDO) to ask whether specific aspects of the care of dialysis patients in outpatient dialysis units, in the period immediately after discharge from hospitalization, might affect whether these patients are readmitted quickly to the hospital (within 30 days) after discharge. This study represents a retrospective look at data collected prospectively on all individuals in the LDO cohort. The authors demonstrated four principal findings. First, that anemia, mineral-bone disease parameters, and dry weights deteriorated in direct relationship to the length of hospital stay. Second, that simply obtaining the laboratory values for calcium, phosphorus, and parathyroid hormone (PTH) did not appear to impact hospital readmission rates, but obtaining a hemoglobin value did. The latter is more likely to translate into changes in erythropoietin (EPO) prescription because of the impact of Medicare reimbursement policies and, hence, the management of anemia largely by protocol. Third, that a 16% reduction in risk of repeat hospitalization was observed for those patients in whom the EPO dose was modified in conjunction with a hemoglobin order. As noted by the authors, the hazard ratio for repeat hospitalization changed most steeply in the first 7 days (a 28% risk benefit in the first 14 days versus a 15% risk benefit for the full 30-day period), consistent with a relatively immediate effect of the administration of EPO. Fourth, that resumption of vitamin D was associated with reduced re-hospitalization risk, independent of whether calcium, phosphorus, and PTH were checked. Administering vitamin D reduced the risk by 6%, and not resuming vitamin D in those patients previously on vitamin D increased risk by 9%.
Similar to EPO, the greatest benefit from resumption of vitamin D also occurred early after hospital discharge. Given that this is an observational study, it is likely that, were the same question interrogated with a randomized controlled clinical trial, the magnitude of demonstrated benefit might be somewhat diminished. Validity and threats to validity: There are two major sources of error that might arise from an observational study such as the study by Chan and colleagues: error introduced by biases such as measurement, ascertainment, and spectrum biases, and error arising from confounding. A retrospective look at data, even data collected prospectively for clinical or non-study purposes, is particularly susceptible to errors in identification of exposure. In the current study, the exposure can be defined as whether patients were evaluated for anemia and mineral metabolism abnormalities, where such evaluations resulted either in resumption of or in changes in their vitamin D and ESA orders within 7 days post-discharge. Since the latter are billable events, it is unlikely that there were many failures to record occurrences. It is more likely that ascertainment of hospitalizations, either the case-defining initial hospitalization or the primary study outcome (re-hospitalization), might have been missed in a database generated by an outpatient dialysis unit. It seems plausible that short, less severe hospitalizations would have been under-recorded. If such under-recording occurred, record keeping was likely most complete in those units exhibiting best practices, where early resumption of outpatient anemia and mineral-bone disease management might have been most likely. Therefore, the likely direction of this bias would have been to diminish the observed magnitude of benefit seen with early resumption of therapies.
The authors attempt to account for these potential biases by performing a propensity score-based, risk-adjusted analysis, a maneuver that can identify and adjust for some of the unit- and physician-specific differences in care. This method cannot, however, entirely eliminate the possibility of confounding, that is, the notion that the benefit associated with early therapy is simply a marker for something else, such as better overall care. The possibility that the results are due to confounding is reduced somewhat by the size and scope of the dataset. Furthermore, the authors have explored this possibility rigorously, using a nested case-control methodology in which the concurrent controls are drawn from the same dialysis units/nephrologists as the cases. Since a study such as this cannot entirely eliminate the potential for bias and confounding, the results must be interpreted and applied with caution. Clinical bottom line: The study by Chan and colleagues has particular relevance in light of the discussions regarding healthcare reform, including the impact of bundling on ESRD care. It has been recognized for some time that many dialysis-dependent patients demonstrate deterioration in hemoglobin, nutrition, and mineral-bone parameters during prolonged hospitalizations. This study confirms this deterioration. Additionally, this observational study provides some of the most compelling evidence to date that aggressive management of hospitalization-related deterioration in metabolic and clinical parameters in the immediate post-hospitalization period might significantly and favorably impact rehospitalization risk. Any cost savings from currently proposed reforms such as bundling could be significantly attenuated if there were an unanticipated negative impact of reform on aggressive pharmacologic management of the dialysis patient post-hospitalization.
Furthermore, it is likely that repeat hospitalization will be associated with increased risk of death further impacting quality of life and cost-effectiveness, an issue that could not be addressed by the current study, but one that is relevant to current national quality of care efforts. This robust study supports the view that clinical assessment immediately post-hospitalization might improve patient outcomes and more specifically, that some of the benefits might be related to EPO and vitamin D therapies. This study does not provide any insights into why EPO and vitamin D therapies might have such an important impact on immediate outcomes. The magnitude of the benefits and the optimal type and timing of dose modification awaits further exploration with randomized controlled trials. Pending RCT confirmation, it seems prudent to consider measures to eliminate barriers to thoughtful therapeutic modification of the care of the dialysis patient within the first 7 days after discharge from hospital.
- Abstract
- 10.1016/j.ekir.2019.05.861
- Jul 1, 2019
- Kidney International Reports
MON-072 FAST PERITONEAL MEMBRANE PERMEABILITY WAS NOT ASSOCIATED WITH MORTALITY IN PATIENTS ON PERITONEAL DIALYSIS
- Research Article
- 10.1111/nep.13154
- Sep 24, 2018
- Nephrology (Carlton, Vic.)
Cardiovascular disease is associated with morbidity and mortality in peritoneal dialysis patients, but the relationship between left ventricular ejection fraction (LVEF) and outcomes is unclear. This study aimed to explore the association between LVEF and mortality in incident continuous ambulatory peritoneal dialysis (CAPD) patients. The patients were divided into three groups according to LVEF levels (>0.6, 0.5 to 0.6, and <0.5). Kaplan-Meier analysis and Cox proportional hazards models were used to evaluate the association of LVEF with mortality. Among the 594 patients, LVEF levels of >0.6, 0.5 to 0.6, and <0.5 were detected in 428 (72.0%), 127 (21.4%), and 39 (6.6%) patients, respectively. During a median follow-up of 39.6 months, 127 (21.4%) patients died; of these deaths, 57.5% were attributable to cardiovascular causes. Patients with LVEF <0.5 had the worst overall survival and cardiovascular death-free survival among the groups. Compared with LVEF >0.6, the adjusted all-cause mortality hazard ratios (HR) and 95% confidence intervals (CI) for patients with LVEF 0.5 to 0.6 and <0.5 were 1.62 (1.09-2.43) and 1.93 (1.06-3.52), respectively. The corresponding adjusted cardiovascular mortality HRs were 1.60 (0.94-2.47) and 2.16 (1.04-4.74), respectively. Reduced LVEF is significantly associated with increased all-cause and cardiovascular mortality in incident CAPD patients.
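The Kaplan-Meier analysis used here to compare survival across LVEF strata is the nonparametric product-limit estimate: at each observed death time, survival is multiplied by the fraction of the at-risk set that survives, with censored patients only shrinking the risk set. A minimal sketch in plain Python, using made-up follow-up data rather than anything from the study:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.

    times:  follow-up time for each patient
    events: True if death observed, False if censored at that time
    Returns [(t, S(t))] evaluated at each observed death time.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e)
        censored = sum(1 for tt, e in data if tt == t and not e)
        if deaths:
            # only death times change the survival estimate
            surv *= (at_risk - deaths) / at_risk
            curve.append((t, surv))
        # both deaths and censorings leave the risk set
        at_risk -= deaths + censored
        i += deaths + censored
    return curve
```

For example, five patients with times [1, 2, 3, 4, 5] and event flags [death, censored, death, censored, death] yield survival steps at t = 1, 3, and 5; the censored patients at t = 2 and t = 4 reduce the denominator for later steps without forcing a drop of their own.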
- Research Article
- 10.1038/s41598-025-15473-z
- Aug 12, 2025
- Scientific reports
No study has comprehensively investigated the association between comorbidities, uremic-specific complications (collectively defined as clinical uremic syndrome [CUS]), and mortality in peritoneal dialysis (PD) patients. We conducted a retrospective cohort study including 4,424 incident PD patients from seven centers in China. Comorbidities and complications were each assigned one point: cardiovascular disease, peripheral vascular disease, cerebrovascular disease, diabetes mellitus, hypertension, hyperlipidemia, malnutrition, and anemia. Patients aged 50 years or older received additional points. The total score (CUS score) was calculated to evaluate its association with mortality in PD patients. Over 18,898.4 person-years of follow-up, 1,208 patients (27.3%) died. The median CUS score was 3 (interquartile range [IQR] 2-5; range, 1-11). A nonlinear association between CUS scores and all-cause mortality was observed (nonlinear p = 0.006). Each 1-point increase in the CUS score was associated with a 1.35-fold increase in the risk of all-cause mortality (95% confidence interval [CI], 1.31-1.39). Compared with patients with CUS scores ≤ 3, those with scores > 3 had a 2.81-fold higher risk of mortality (95% CI, 2.47-3.21). Higher CUS scores were significantly associated with increased all-cause mortality risk in PD patients, particularly in those with scores > 3.
- Research Article
- 10.1111/1744-9987.14075
- Oct 24, 2023
- Therapeutic Apheresis and Dialysis
To assess the relationship between the rate of residual renal function (RRF) decline in the first year and all-cause and cardiovascular mortality in peritoneal dialysis (PD) patients, incident PD patients were divided into two groups at the RRF decline value at which the hazard ratio (HR) equaled 1, as determined by restricted cubic spline analysis. The associations of the rate of RRF decline in the first year with mortality were then evaluated. Of 497 PD patients, 122 died. After adjusting for confounding factors, patients in the fast-decline group had a significantly increased risk of all-cause and cardiovascular mortality (HR: 1.97 and 2.09, respectively). Each 0.1 mL/min/1.73 m²/month decrease in RRF in the first year of PD was associated with a 19% and 20% higher risk of all-cause and cardiovascular mortality, respectively. A faster decline of RRF in the first year was independently associated with all-cause and cardiovascular mortality in PD patients.
- Research Article
- 10.1002/dat.20598
- Aug 1, 2011
- Dialysis & Transplantation
Growing a peritoneal dialysis program: A single‐center experience
- Research Article
- 10.1016/j.numecd.2020.12.018
- Feb 19, 2021
- Nutrition, Metabolism and Cardiovascular Diseases
Abnormal iron status is associated with an increased risk of mortality in patients on peritoneal dialysis
- Research Article
- 10.3389/fcvm.2021.751182
- Nov 3, 2021
- Frontiers in Cardiovascular Medicine
Background: Studies have shown inconsistent associations between serum uric acid (SUA) levels and mortality in peritoneal dialysis (PD) patients. We conducted this meta-analysis to determine whether SUA levels were associated with cardiovascular or all-cause mortality in PD patients. Methods: PubMed, Embase, Web of Science, the Cochrane Library, CNKI, VIP, Wanfang Database, and trial registry databases were systematically searched up to April 11, 2021. Cohort studies of SUA levels and cardiovascular or all-cause mortality in PD patients were obtained. Random-effects models were used to calculate the pooled adjusted hazard ratio (HR) and corresponding 95% confidence interval (CI). Sensitivity analyses were conducted to assess the robustness of the pooled results. Subgroup analyses and meta-regression analyses were performed to explore sources of heterogeneity. Funnel plots, Begg's tests, and Egger's tests were conducted to evaluate potential publication bias. The GRADE approach was used to rate the certainty of evidence. This study was registered with PROSPERO, CRD42021268739. Results: Seven studies covering 18,113 PD patients were included. Compared with middle SUA levels, high SUA levels increased the risk of all-cause mortality (HR = 1.74, 95%CI: 1.26–2.40, I2 = 34.8%, τ2 = 0.03), while low SUA levels were not statistically significantly associated with the risk of all-cause or cardiovascular mortality (HR = 1.04, 95%CI: 0.84–1.29, I2 = 43.8%, τ2 = 0.03; HR = 0.89, 95%CI: 0.65–1.23, I2 = 36.3%, τ2 = 0.04; respectively). Compared with low SUA levels, high SUA levels were not statistically associated with an increased risk of all-cause or cardiovascular mortality (HR = 1.19, 95%CI: 0.59–2.40, I2 = 88.2%, τ2 = 0.44; HR = 1.22, 95%CI: 0.39–3.85, I2 = 89.3%, τ2 = 0.92; respectively). Conclusion: Compared with middle SUA levels, high SUA levels are associated with an increased risk of all-cause mortality in PD patients. SUA levels may not be associated with cardiovascular mortality. More high-level studies, especially randomized controlled trials, are needed to determine the association between SUA levels and cardiovascular or all-cause mortality in PD patients. Systematic Review Registration: https://www.crd.york.ac.uk/prospero/display_record.php?ID=CRD42021268739, identifier: CRD42021268739.
- Research Article
- 10.6221/an.2012040
- Dec 1, 2013
BACKGROUND: Cardiovascular (CV) events are among the major causes of mortality and morbidity in dialysis patients. The purpose of this study was to investigate the potential risk factors for CV complications in incident peritoneal dialysis (PD) patients with no underlying coronary artery disease or congestive heart failure before PD. METHODS: The study was performed retrospectively in a hospital-facilitated PD center. A total of 122 adult patients without a known history of coronary heart disease or heart failure were enrolled from January 2005 to December 2008, with an observation period of 3 years. The mean age of the subjects was 55.2 years, and the men-to-women ratio was 53:69. The analyzed variables included biochemical profiles, peritoneal transport rate, cardiothoracic ratio on chest radiography, Kt/V urea, and weekly creatinine clearance indices at baseline and 3 years after PD initiation. Records of prescriptions for renin-angiotensin-system blockade drugs, glucose-free PD solution dwell, and total glucose exposure were also included in the analysis. Primary outcomes were defined as CV events recorded in the emergency department, in the outpatient clinic, and on hospital admission. RESULTS: Twenty-two patients had CV events during the study period. There were statistically significant differences in diabetes, hypertension, fasting blood sugar level, and glucose-free PD solution dwell between subjects with and without CV events. Multivariate analysis revealed that higher baseline fasting sugar (≥126 mg/dL) (HR, 3.7; 95% CI, 1.2-11.4) and glucose-free PD solution dwell (HR, 7.3; 95% CI, 1.380-135.8) were risk factors for CV events in incident PD patients. CONCLUSION: The study showed that baseline serum glucose level and glucose-free PD solution dwell were potential risk factors predicting CV events after PD initiation. The results indicate the essential role of glucose and fluid control in PD patients.
- Research Article
- 10.1016/j.jacl.2020.01.008
- Jan 25, 2020
- Journal of Clinical Lipidology
Serum lipoprotein(a) and risk of mortality in patients on peritoneal dialysis