Articles published on Cohort Study
439,452 search results, sorted by recency
- New
- Research Article
- 10.1016/j.vaccine.2026.128240
- Mar 7, 2026
- Vaccine
- Alexandra N Paxitzis + 22 more
Antibody responses to SARS-CoV-2 vaccine in nursing home residents support a bi-annual update schedule.
- New
- Research Article
- 10.1016/j.vaccine.2026.128268
- Mar 7, 2026
- Vaccine
- Andrea J Sharma + 13 more
COVID-19 vaccination during or just prior to pregnancy and hypertensive disorders of pregnancy.
- New
- Research Article
- 10.1186/s41043-025-01193-7
- Mar 5, 2026
- Journal of health, population, and nutrition
- Na Li + 9 more
Atherosclerosis, the primary pathological basis of cardiovascular diseases, exhibits a strong association with glucose metabolism dysregulation. While cross-sectional studies have linked fasting blood glucose (FBG) to atherosclerosis risk, the dose-response relationship and threshold characteristics of long-term FBG trajectories remain poorly characterized. This retrospective cohort study aimed to investigate longitudinal FBG trajectory patterns and their associations with atherosclerosis risk prevalence, incidence, and recovery in Chongqing, China, while also identifying population-specific risk thresholds. Based on three-year longitudinal follow-up data collected annually from 2017 to 2019, a group-based trajectory model (GBTM) was adopted to identify dynamic trajectories of FBG. The association between FBG and atherosclerosis risk was analyzed using multivariable logistic regression. Restricted cubic splines (RCS) were used to assess the non-linear relationship between FBG and atherosclerosis risk and to determine risk thresholds. Confounding factors such as age, sex, body mass index (BMI), blood pressure, and lipids were adjusted for in the regression models, and subgroup analyses were performed to examine the interactions of age, sex, and BMI. Longitudinal analysis showed that, compared with the Trajectory Normal Glucose Regulation (NGR) group, the Trajectory Prediabetes Mellitus (Pre-DM) group had significantly higher prevalence (OR: 2.02, 95% CI: 1.63-2.51) and incidence (OR: 1.66, 95% CI: 1.15-2.39) of atherosclerosis risk. The Trajectory Pre-DM group also had a significantly lower likelihood of atherosclerosis risk recovery than the Trajectory NGR group (OR: 0.55, 95% CI: 0.39-0.79). Dose-response analysis revealed a non-linear association between FBG and atherosclerosis risk prevalence, with a risk threshold at 5.10 mmol/L. 
This suggests that the atherosclerosis risk threshold in Chongqing is significantly lower than the international prediabetes standard of 5.60 mmol/L. Subgroup analyses showed sex and age differences, with lower thresholds in women and younger individuals. Long-term elevation of FBG was associated with increased atherosclerosis risk. The study suggests that intervention strategies should be based on dynamic blood glucose trajectories and population-specific thresholds, especially lower thresholds for women and younger individuals. This study provides evidence-based support for regional atherosclerosis risk prevention and control.
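Several abstracts in this list report effect sizes as odds ratios with 95% confidence intervals from logistic regression. As a minimal sketch of how such a pair is derived (using made-up coefficient values, not this study's data), the point estimate and interval come from exponentiating the fitted coefficient and its Wald bounds:

```python
import math

def odds_ratio_ci(beta, se, z=1.959964):
    """Turn a logistic-regression coefficient and its standard error
    into an odds ratio with a two-sided 95% Wald confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative values only: a coefficient of ~0.703 with SE ~0.11
# yields an OR near 2.02 (95% CI ~1.63-2.51), the shape of estimate
# reported above for the Trajectory Pre-DM group.
or_, lo, hi = odds_ratio_ci(0.703, 0.110)
```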
- New
- Research Article
- 10.1001/jamaoncol.2026.0127
- Mar 5, 2026
- JAMA oncology
- Frank P Lin + 33 more
The clinical utility of matching therapies to genomic biomarkers based on varying levels of evidence remains uncertain, particularly for patients with rare and refractory cancers. To assess whether a tiered, evidence-based framework for matching genomic biomarkers to therapies is associated with differential overall survival in patients with advanced solid tumors. This multicenter cohort study was conducted within the Molecular Screening and Therapeutic program, a nationwide precision oncology program in Australia. Patients aged 18 years and older with advanced, refractory solid tumors and adequate Eastern Cooperative Oncology Group Performance Status were enrolled from June 2016 to December 2021, with follow-up through July 2022. Data were analyzed from July 2022 to July 2024. Systemic therapy following comprehensive genomic profiling. Therapies were classified as matched or unmatched using the TOPOGRAPH (Therapy-Oriented Precision Oncology Guidelines for Recommending Anticancer Pharmaceuticals) knowledge base, which stratifies biomarker-drug pairs by level of evidence (tiers 1-3A, prospective trial evidence; tiers 3B/4, investigational/repurposed). The primary outcome was overall survival from date of molecular profiling results. The hypothesis was tested using a time-dependent multivariable Cox proportional hazards model, adjusted for age, Eastern Cooperative Oncology Group Performance Status, cancer type, prior therapy, and prior receipt of matched therapy. Of 3383 patients (mean [SD] age 57.1 [14.3] years; 1792 [53.0%] female), 1270 (37.5%) had a clinically active (tiers 1-3A) biomarker. Among patients with a tier 1 to 3A biomarker receiving treatment, those receiving matched therapy had a longer median overall survival than those receiving unmatched therapy (21.2 months [95% CI, 17.1-26.8 months] vs 12.8 months [95% CI, 11.7-13.9 months]; adjusted hazard ratio [aHR], 0.60; 95% CI, 0.44-0.82; P = .001). 
In contrast, among patients receiving therapy matched to investigational evidence (tiers 3B/4), there was not an associated survival benefit compared with unmatched therapy (14.5 months [95% CI, 12.6-18.4 months] vs 12.8 months [95% CI, 12.0-14.7 months]; aHR, 1.04; 95% CI, 0.84-1.29; P = .71). Patients who received therapies repurposed from other cancer types based solely on a biomarker and lacking direct evidence (tier 3B) did not experience longer survival compared with those receiving unmatched therapy (13.6 months [95% CI, 8.0-16.8 months] vs 12.5 months [95% CI, 11.3-13.5 months]; aHR, 1.40; 95% CI, 1.00-1.96; P = .047). In this cohort study of patients with advanced solid tumors, matching therapies to genomic biomarkers was associated with improved survival only when supported by prospective clinical trial evidence. These findings support using an evidence-based framework to prioritize genomically guided therapies.
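The median overall survival figures above come from Kaplan-Meier estimates. A minimal pure-Python sketch of the product-limit estimator on a toy cohort (not the trial's data) shows how censored observations reduce the risk set without triggering a survival drop:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.
    times: follow-up durations (e.g. months); events: 1 = death, 0 = censored.
    Returns [(t, S(t))] at each distinct event time."""
    data = sorted(zip(times, events))
    n_at_risk, surv, curve = len(data), 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        tied = [e for tt, e in data if tt == t]   # everyone leaving at time t
        deaths = sum(tied)
        if deaths:
            surv *= 1 - deaths / n_at_risk        # survival drops only on events
            curve.append((t, surv))
        n_at_risk -= len(tied)                    # censored subjects also exit
        i += len(tied)
    return curve

# Toy cohort of 5: deaths at t=1, 2, 3; censoring at t=2 and t=4.
curve = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
```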
- New
- Research Article
- 10.1017/ice.2026.10411
- Mar 5, 2026
- Infection control and hospital epidemiology
- Zhengxi Chen + 5 more
Clostridioides difficile infection (CDI) requiring colectomy carries substantial mortality risk, with optimal timing of surgery remaining poorly defined. We examined temporal trends in colectomy among inpatients with CDI, identified predictors of surgical intervention and postoperative mortality, and evaluated the association between surgical timing and patient outcomes. A retrospective cohort study was conducted using the National Inpatient Sample database from 2018 to 2022. We compared patients undergoing colectomy with those managed medically. To minimize confounding by hospital-onset cases, the analysis of surgical timing and mortality was restricted to patients undergoing colectomy within 8 days of admission. Predictors were identified using survey-weighted logistic regression and LASSO regression models. Among 240,564 CDI hospitalizations (representing 1,207,995 weighted nationally), 717 patients underwent colectomy (3,585 weighted). CDI prevalence declined from 0.99% (2018) to 0.76% (2022), while colectomy rates increased from 0.28% to 0.34%. Peritonitis (OR 5.42; 95% CI, 4.46-6.59), coagulopathy (OR 4.96; 95% CI, 3.76-6.55), and sepsis/septic shock (OR 3.89; 95% CI, 3.39-4.47) were the strongest predictors of colectomy. Among patients undergoing colectomy within 8 days (2,830 weighted), in-hospital mortality was 26.5% overall, increasing from 21.0% (2018) to 30.7% (2022). Sepsis/septic shock (OR 8.20; 95% CI 2.92-23.07) and coagulopathy (OR 7.27; 95% CI 3.31-15.97) predicted mortality. Each additional day from admission to colectomy was associated with a 16% (OR 1.16; 95% CI 1.04-1.28) increased mortality risk. In this nationally representative cohort, surgical timing was an independent and modifiable determinant of survival in patients with CDI requiring colectomy. Our findings underscore the importance of early surgical consultation for CDI patients with peritonitis, sepsis, and coagulopathy.
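The per-day estimate above (OR 1.16 per additional day to colectomy) compounds multiplicatively on the odds scale under the fitted log-linear model. A quick sketch of what that implies for longer delays (illustrative arithmetic only, not an output of the paper's model):

```python
or_per_day = 1.16  # reported odds ratio per additional day from admission to colectomy

# On the log-odds scale the effect is additive, so a d-day delay
# multiplies the odds by or_per_day ** d; a 5-day delay roughly
# doubles the odds of in-hospital mortality under this model.
or_5_days = or_per_day ** 5
```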
- New
- Research Article
- 10.1080/10790268.2025.2566561
- Mar 5, 2026
- The journal of spinal cord medicine
- Benjamin M Abraham + 5 more
Cervical traumatic spinal cord injury (SCI) is a devastating condition that can result in tetraplegia. Early surgical decompression and rehabilitative efforts in cervical SCI patients have been shown to improve neurological outcomes. In this study, we sought to evaluate the impact of various factors at the time of injury and throughout the rehabilitative period on motor functional independence at 1 and 5 years after injury in patients undergoing cervical spinal cord decompression. A longitudinal, retrospective cohort study from the multicenter Spinal Cord Injury Model Systems (SCIMS) database was conducted on patients who presented between 1998 and 2011 with motor Functional Independence Measure (mFIM) scores at rehabilitation admission (RA) and discharge (DC) from inpatient rehabilitation (IPR), year 1, and year 5. Patients who had undergone surgical decompression with neurological levels of injury limited to the cervical region and those with American Spinal Injury Association (ASIA) Impairment Scale (AIS) grades of A and B were included. The mFIM score was utilized to calculate changes in mFIM (ΔmFIM) scores over each respective time period. Multivariable logistic regression was performed to identify longitudinal predictors associated with functional independence, controlling for demographics, SCI etiology and level, vertebral bony fracture/dislocation, associated injuries, AIS grade, and discharge disposition. A total of 351 patients were included. A majority were 15-29 years old (53.6%), and 80.1% were male. Vehicular-related etiologies (49.9%) were most commonly implicated as the mechanism of cervical SCI. Most patients were AIS grade A at the time of RA (66.4%), with a median time from injury to RA of 19 days (IQR 11-32) for all patients. A total of 131 patients required ventilatory support at the time of RA, and of those, only 12 eventually became functionally independent (FI) by year 5. 
Although the number of patients requiring ventilatory support decreased from 131 to 20 by year 5, no additional patients became FI if they required support at the time of DC or year 1. By 1 year, 43 (12.3%) patients achieved functional independence, and an additional 12 (3.42%) achieved functional independence by year 5. Although those with AIS B injuries (aOR = 5.23, P = 0.0014) and AIS improvement (aOR = 5.14, P = 0.0004) had a greater likelihood of FI by year 1, year 1 FI was more strongly predicted by greater ΔmFIM score during a shorter IPR time period (ΔmFIM score during IPR, aOR = 14.2, P < 0.0001). While AIS grade and AIS improvement were no longer predictive of year 5 FI (P > 0.09 for both), the ΔmFIM score during IPR remained as the strongest predictor towards achieving FI by year 5 (aOR = 23.1, P < 0.0001). Furthermore, a stratified analysis of those patients who did not achieve FI at year 1 revealed that the ΔmFIM score during IPR was an even greater predictor of FI at year 5 (aOR = 53.0, P < 0.0001). A similar relationship was observed where stratification of patients by AIS grade showed that AIS A injuries demonstrated a higher likelihood of 5-year FI due to ΔmFIM score during IPR (aOR = 51.3, P = 0.0002 vs. AIS B: aOR = 42.2, P = 0.0009). Similarly, stratification by need for ventilatory support at RA revealed those who did require ventilatory assistance at RA also had a higher likelihood of 5-year FI due to ΔmFIM score during IPR (aOR = 171, P = 0.035). In patients who suffered an AIS A/B cervical SCI, IPR contributed to achieving functional independence in up to 5 years after the inciting injury. By year 1, 12.3% of patients achieved functional independence, and from years 1 to 5, an additional 3.42% of patients achieved functional independence. Although AIS B patients and patients with AIS improvement had improved outcomes at year 1, only the ΔmFIM score during IPR predicted eventual FI status by year 5. 
The increased likelihood of attaining functional independence at year 5 in AIS A SCI underscores the importance of continued rehabilitation for patients who do not attain functional independence by year 1. Our study highlights the crucial role of early and aggressive rehabilitation following surgical intervention toward ultimate functional independence in traumatic cervical SCI patients enduring complete loss of motor function.
- New
- Research Article
- 10.1093/humrep/deag031
- Mar 5, 2026
- Human reproduction (Oxford, England)
- H Hattori + 3 more
Do clinical and perinatal outcomes of dichorionic diamniotic (DCDA) twin pregnancies differ between single embryo transfer (SET) and double embryo transfer (DET) in human medically assisted reproduction (MAR)? In DCDA twin pregnancies, SET was associated with a significantly higher incidence of complete miscarriage and a lower rate of twin live births than DET. While DET has historically been the major contributor to dizygotic DCDA twins, the global adoption of SET has markedly reduced such cases. However, monozygotic twinning (MZT) occurs more frequently after MAR, especially in blastocyst transfer cycles, and the prognosis of monozygotic DCDA twins remains poorly understood. This single-center retrospective cohort study analyzed 206 clinical multiple pregnancies achieved between January 2014 and December 2024, following 4658 fresh and 15872 frozen-warmed embryo transfer cycles. Only cycles using autologous oocytes were included. Clinical and perinatal outcomes of DCDA twin pregnancies derived from SET and DET were compared. To account for baseline differences between SET and DET groups, an exploratory multivariable logistic regression analysis was performed for clinical outcomes. Statistical analyses were performed using the Mann-Whitney U-test and Fisher's exact test, with P < 0.05 considered significant. When comparing the clinical course of DCDA twin pregnancies, the incidence of two gestational sacs and two fetal heartbeats was significantly higher in the DET group than in the SET group (98.0% vs 47.2%, P < 0.0001; 63.5% vs 25.5%, P < 0.0001) (two fetal heartbeats: adjusted odds ratios [aOR], 0.276; 95% CI, 0.108-0.706; P < 0.007). Twin live birth occurred in 53.1% of DET-derived DCDA twins and 17.6% of SET-derived DCDA twins (P < 0.0001) (aOR, 0.324; 95% CI, 0.121-0.867; P = 0.025), whereas complete miscarriage was more frequent after SET (49.0% vs 17.7%, P < 0.0001) (aOR, 9.140; 95% CI, 3.030-27.600; P < 0.0001). 
Perinatal outcomes, including gestational age, birth weight, and congenital anomaly rates, did not differ significantly between groups. The number of monozygotic cases was limited, and zygosity could not be genetically confirmed. Some same-sex DCDA twins may have been dizygotic in origin. These findings highlight that DCDA twin pregnancies should not be regarded as a uniform clinical entity in MAR. Even within the same chorionicity category, early outcomes differ significantly between monozygotic twins after SET and dizygotic twins after DET. Although SET remains the optimal strategy to prevent multiple pregnancies, further studies should aim to identify embryos at higher risk of post-transfer splitting and to refine preventive criteria for MZT. This study received no funding.
- New
- Research Article
- 10.1097/ccm.0000000000007086
- Mar 5, 2026
- Critical care medicine
- Ravindranath Tiruvoipati + 6 more
The mortality among patients admitted with sepsis remains high and varies depending on the site of infection. The impact of hypercapnia and acidemia on clinical outcomes in mechanically ventilated patients with sepsis is not well understood. This multicenter, binational, retrospective study assessed the association of compensated hypercapnia, hypercapnic acidemia, and nonrespiratory acidemia with mortality in mechanically ventilated patients with sepsis. Data were extracted from the "Australian and New Zealand Intensive Care Society Centre for Outcome and Resource Evaluation adult patient" database over a 17-year period (January 2006 to December 2022) from 201 ICUs. Patients were classified into four mutually exclusive groups based on a combination of arterial pH and arterial CO2 recorded during the first 24 hours of ICU stay: normocapnia with normal pH, fully compensated hypercapnia, hypercapnic acidemia, and nonrespiratory acidemia. Logistic regression and Cox proportional hazards regression were used to examine the association of compensated hypercapnia, hypercapnic acidemia, and nonrespiratory acidemia with hospital mortality. Fifty-two thousand four hundred five patients were included. Overall, compensated hypercapnia (odds ratio [OR], 1.39; 95% CI, 1.24-1.55; p < 0.001), hypercapnic acidemia (OR, 1.68; 95% CI, 1.57-1.80; p < 0.001), and nonrespiratory acidemia (OR, 1.75; 95% CI, 1.61-1.90; p < 0.001) were associated with increased risk of hospital mortality as compared with patients with normocapnia and normal pH. The risk of increased hospital mortality associated with hypercapnic and nonrespiratory acidemia persisted in all prespecified diagnostic subgroups when compared with patients who had normal pH and normocapnia. Compensated hypercapnia was associated with increased mortality risk in neurologic and unspecified subgroups of sepsis. 
Hypercapnic acidemia and nonrespiratory acidemia within the first 24 hours of ICU admission are associated with increased risk of hospital mortality in mechanically ventilated patients with sepsis. This association remains consistent in all diagnostic subgroups of sepsis.
- New
- Research Article
- 10.1245/s10434-026-19386-7
- Mar 5, 2026
- Annals of surgical oncology
- Zhan Liu + 5 more
This study aimed to investigate the incidence, characteristics, risk factors, and prognostic implications of myocardial injury after non-cardiac surgery (MINS) in patients with lung cancer undergoing pulmonary resection. We conducted a retrospective analysis of 1314 consecutive patients with lung cancer undergoing elective pulmonary resection between June and November 2023 at a tertiary cancer referral center. Univariate and multivariate logistic regression analyses were used to identify independent risk factors. Kaplan-Meier survival analysis with log-rank tests was used to evaluate 30-day mortality and major adverse cardiovascular events (MACE). Subgroup analyses according to the extent of lung resection were also conducted. The overall incidence of MINS following lung cancer surgery was 10.4%. The majority of cases (92.7%) occurred within the first postoperative day and were predominantly asymptomatic (78.1%). Independent preoperative risk factors for MINS included male sex, coronary artery disease, creatinine, high-sensitivity cardiac troponin T, thoracotomy, lobectomy, and duration of tachycardia. Although MINS showed no association with 30-day postoperative mortality, it significantly increased the risk of MACE at 30 days in the overall (7.3% vs. 0.2%, p < 0.001), lobectomy (7.0% vs. 0.3%, p < 0.001), and sublobar resection (9.1% vs. 0, p = 0.002) cohorts. MINS is a common postoperative complication following lung cancer surgery and typically occurs in the early postoperative period. Although it is predominantly asymptomatic, it was significantly associated with increased 30-day MACE.
- New
- Research Article
- 10.4292/wjgpt.v17.i1.112803
- Mar 5, 2026
- World Journal of Gastrointestinal Pharmacology and Therapeutics
- Vedran Tomasic + 8 more
BACKGROUND Increasing age is a major risk factor for colorectal neoplasia, with older adults showing a higher incidence of adenomas compared to individuals under 60 years. Early detection of colonic adenomas and polyps significantly reduces the risk of colorectal cancer. Key quality indicators for colonoscopy include the adenoma detection rate (ADR), polyp detection rate (PDR), and cecal intubation rate (CIR). However, studies comparing these metrics in elderly patients deeply sedated with propofol vs those undergoing colonoscopy without sedation show mixed results. AIM To evaluate the impact of deep propofol sedation vs no sedation on ADR, PDR, and CIR in elderly patients undergoing screening colonoscopy. METHODS This retrospective cohort study included adults over 60 years who underwent their first screening colonoscopy between January 2017 and September 2023. Exclusion criteria were emergency procedures, inflammatory bowel disease, procedures performed with therapeutic intent, and inadequate bowel preparation [Boston Bowel Preparation Scale (BBPS) score below 6]. Normality was tested by the Kolmogorov-Smirnov test; continuous variables were compared by the Mann-Whitney U test, and categorical variables by the χ2 or Fisher's exact test. Binary logistic regression identified significant outcome predictors. RESULTS A total of 2034 patients (46.4% female; mean age: 70 years) were included, of whom 622 (30.6%) underwent colonoscopy under deep sedation. The overall PDR was 51.65%, ADR was 33.3%, and CIR was 94.25%. After adjusting for confounders [age, sex, body mass index (BMI), BBPS, operation, and diverticulosis], no significant differences were observed in PDR (51.8% vs 51.5%), ADR (33.5% vs 32.5%), or CIR (93.2% vs 95.3%) between the no-sedation and deep-sedation groups. Higher BMI (B = 0.96, P < 0.01) and male sex (B = 0.64, P < 0.01) were independent predictors of higher ADR. 
CONCLUSION In this elderly cohort, propofol-induced deep sedation did not significantly improve ADR, PDR, or CIR. Further research is warranted to clarify its effect on colonoscopy quality metrics in older populations.
- New
- Research Article
- 10.1097/ccm.0000000000007088
- Mar 5, 2026
- Critical care medicine
- Zhaofeng Kang + 5 more
Sepsis triggers both excessive inflammation and immunosuppression, the latter partly characterized by CD4+ T-cell depletion. The mechanisms underlying this depletion, especially its interplay with cytokine storms driven by inflammatory factors such as interleukin (IL)-6, remain unclear. This study aimed to elucidate the molecular mechanisms contributing to CD4+ T-cell depletion in sepsis, focusing specifically on the IL-6/Janus kinases (JAKs)/signal transducer and activator of transcription 3 (STAT3) signaling axis and programmed cell death. Prospective cohort study. Adult ICUs at a university hospital. Adult sepsis and septic shock patients without any documented immune comorbidity. None. A prospective analysis enrolled 151 patients (93 sepsis, 58 septic shock) and 20 controls. Flow cytometry and enzyme-linked immunosorbent assay were used to assess immune cell populations and cytokine profiles, with multivariate analyses exploring their interrelationships. An additional 30 sepsis patients and ten controls were recruited to investigate mechanisms. Peripheral blood mononuclear cells (PBMCs) underwent RNA sequencing (RNA-seq). Isolated CD4+ T cells were stimulated with IL-6 in vitro, followed by treatment with specific inhibitors targeting pyroptosis, apoptosis, necroptosis, the JAKs/STAT3 pathway, or receptor-interacting protein kinase 1 (RIPK1). Western blotting, flow cytometry, immunofluorescence, Cell Counting Kit-8 assays, and interferon-gamma staining evaluated cell death pathways, PANoptosome (a complex mediating apoptosis, pyroptosis and necroptosis)-assembly, and function. Significant CD4+ T-cell loss occurred in both sepsis and septic shock groups, strongly correlating with elevated IL-6 levels. Sepsis PBMC RNA-seq revealed activated IL-6/JAKs/STAT3 signaling and upregulated apoptosis/pyroptosis/necroptosis genes. 
In vitro, IL-6 induced pyroptosis, apoptosis, and necroptosis (PANoptosis) in CD4+ T cells via IL-6/JAKs/STAT3-dependent RIPK1-PANoptosome assembly. Inhibiting JAKs/STAT3 or RIPK1 significantly reduced PANoptosis and partially restored CD4+ T-cell viability and functional capacity. PANoptosis was observed as a form of CD4+ T-cell death in sepsis patients. Evidence suggests that IL-6 may be associated with the exhaustion process, mechanistically involving activation of the JAKs/STAT3 pathway. It is also hypothesized that this process might be linked to RIPK1-PANoptosome-mediated PANoptosis.
- New
- Research Article
- 10.1177/19433654261424879
- Mar 5, 2026
- Respiratory care
- Wenting Zhang + 9 more
Postoperative pulmonary complications (PPCs) are a major cause of morbidity and mortality in surgical patients, particularly among the critically ill. Several risk prediction models have been developed to stratify the risk of PPCs. However, comparative evidence on their performance in critically ill populations remains scarce. In the present retrospective cohort study, 495 critically ill surgical subjects who were admitted to a tertiary hospital ICU in China were assessed for PPCs using 3 established risk models: Local Assessment of Ventilatory Management During General Anesthesia for Surgery (LAS VEGAS), Assess Respiratory Risk in Surgical Patients in Catalonia (ARISCAT), and the Chinese Brief Predictive Risk Index (CHI-BPRI). Inclusion required intra-operative mechanical ventilation and postoperative ICU care. Predictive performance was evaluated by receiver operating characteristic analysis, with discrimination quantified using the area under the receiver operating characteristic curve (AUC), sensitivity, specificity, and odds ratio. In the cohort, 20.4% developed PPCs. The LAS VEGAS score had the highest AUC (0.63, 95% CI 0.57-0.68), with 74% sensitivity and 47% specificity, whereas the ARISCAT and CHI-BPRI scores had lower AUCs of 0.60 and 0.55, respectively. Despite the numerical differences, none of the scores achieved statistically superior discrimination (P = .44, .058, and .16), and all AUCs fell below 0.70, indicating suboptimal predictive accuracy. The LAS VEGAS score had the strongest association with PPCs (OR 2.26, 95% CI 1.42-3.60). In critically ill Chinese surgical subjects, the LAS VEGAS, ARISCAT, and CHI-BPRI scores had limited predictive performance for PPCs, and none achieved strong discriminative power. Among them, the LAS VEGAS score performed best and may offer modest utility for early risk identification. 
These findings underscore the need for improved, population-specific prediction tools tailored to the unique risk profiles of critically ill surgical patients.
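The AUC values compared above have a rank interpretation: the AUC equals the probability that a randomly chosen patient who develops a PPC receives a higher risk score than one who does not (the Mann-Whitney formulation). A minimal sketch on toy scores (not study data):

```python
def auc(scores, labels):
    """AUC via the Mann-Whitney statistic: the fraction of
    (positive, negative) pairs ranked correctly, ties counted as 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy risk scores with modest separation: 3 of 4 pairs ranked correctly.
toy_auc = auc([0.9, 0.6, 0.7, 0.8], [1, 0, 1, 0])
```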
- New
- Research Article
- 10.1186/s12887-026-06674-0
- Mar 5, 2026
- BMC pediatrics
- Tafesse Gizaw + 6 more
Time to treatment failure and its predictors among children attending antiretroviral therapy in Jimma, southwest Ethiopia: a retrospective cohort study.
- New
- Research Article
- 10.1080/00016489.2026.2638513
- Mar 5, 2026
- Acta oto-laryngologica
- Fazıl Necdet Ardıç + 5 more
Stapes surgery using Teflon prostheses is widely performed for otosclerosis; however, prosthesis dislocation has been reported in approximately 3.5% of cases within the first two postoperative years. To evaluate the seven-year audiological and patient-reported outcomes of stapedotomy with bone cement fixation. This retrospective cohort study with prospective follow-up included patients who underwent stapedotomy with bone cement fixation between 2012 and 2022. Air conduction and bone conduction thresholds were assessed at short-term (0-1 years), mid-term (1-4 years), and long-term (≥4 years) follow-up. Patient-reported outcomes were evaluated using a procedure-specific questionnaire and the Glasgow Benefit Inventory (GBI). The postoperative air-bone gap (ABG) improved significantly (p < 0.001) and remained stable at long-term follow-up. Progressive elevation in bone conduction thresholds was observed in both operated and contralateral ears. The mean GBI score was 47.5 ± 34.2, with 89% of patients reporting moderate to substantial benefit. Perceived benefit and willingness to recommend surgery were negatively associated with hospitalization difficulties, hearing aid use, and lifestyle limitations, and positively associated with tinnitus improvement and higher GBI scores. Bone cement fixation in stapedotomy provides durable long-term hearing outcomes without adverse effects. Patient-perceived benefit is primarily determined by stable ABG closure and limited progression of sensorineural hearing loss.
- New
- Research Article
- 10.4292/wjgpt.v17.i1.112872
- Mar 5, 2026
- World Journal of Gastrointestinal Pharmacology and Therapeutics
- Linda Yun Zhang + 11 more
BACKGROUND Biliary fully-covered self-expanding metal stents (FCSEMS) are increasingly used over plastic or uncovered SEMS due to their long-term patency and removability. However, there is concern that transpapillary placement may lead to post-endoscopic retrograde cholangiopancreatography (ERCP) pancreatitis (PEP). AIM To assess the rates of PEP with and without the use of FCSEMS for both benign and malignant indications. METHODS We performed a multicenter retrospective cohort study involving three Australian tertiary referral centers. Consecutive adults who underwent ERCP for biliary indications between October 2016 and October 2019 were included. The primary endpoint was the rate of PEP. Secondary endpoints included severity of pancreatitis, other procedure- and stent-related adverse events occurring within 90 days. RESULTS A total of 3401 ERCPs were performed (54.2% female, mean age 62.9±18.6 years) with an overall PEP rate of 3.15%. On propensity-score matched analysis, FCSEMS was an independent predictor of PEP (odds ratio = 5.49, 95% confidence interval: 2.10-6.99; P = 0.001). FCSEMS had a higher rate of PEP (7.8%) compared with plastic stents (3.4%; P = 0.0015), and patients who did not receive any stents (2.4%; P = 0.001), but was non-significant when compared with uncovered self-expanding metal stents (3.9%; P = 0.12). The rate of PEP following FCSEMS decreased to 6.0% for malignant indications, and further to 3.9% for biliary obstruction due to pancreatic cancer, but did not reach statistical significance. CONCLUSION Biliary FCSEMS are associated with a higher risk of PEP compared to no stents or plastic stents, particularly for benign indications.
- New
- Research Article
- 10.3389/fendo.2026.1766149
- Mar 4, 2026
- Frontiers in Endocrinology
- Chong Yan + 1 more
Background and Objective Diabetic peripheral neuropathy (DPN) is a prevalent and debilitating complication of type 2 diabetes mellitus (T2DM). Although glycated hemoglobin (HbA1c) is a primary metric for glycemic control, many patients develop or experience progression of DPN despite achieving HbA1c targets, suggesting the importance of other dynamic glycemic parameters. Glycemic variability (GV) may contribute to nerve injury via mechanisms such as oxidative stress, inflammation, and neurotrophic factor dysregulation. However, clinical evidence linking GV to DPN remains inconsistent, and rigorous studies controlling for confounders are scarce. This study aimed to determine whether GV is independently associated with DPN beyond HbA1c in a propensity score-matched (PSM) cohort and to explore the potential mediating roles of inflammatory cytokines and neurotrophic factors. Methods This single-center retrospective cohort study screened T2DM patients hospitalized between January 1, 2020, and December 31, 2024. Patients with complete 72-hour continuous glucose monitoring (CGM) data and bilateral nerve conduction studies (NCS) were included. DPN was diagnosed according to the Chinese Diabetes Society guidelines. Propensity score matching (PSM, 1:1, caliper=0.02) was used to balance the DPN and non-DPN groups on age, sex, BMI, diabetes duration, HbA1c, systolic blood pressure, LDL-C, and estimated glomerular filtration rate. Primary outcomes included GV parameters (mean amplitude of glycemic excursions [MAGE], coefficient of variation [CV], standard deviation [SD]) and a composite nerve conduction velocity (NCV) Z-score. Serum inflammatory cytokines (IL-6, TNF-α) and neurotrophic factors (NGF, IGF-1) were measured in a nested subcohort. Data were analyzed using multivariable linear regression, dose-response analysis, causal mediation analysis, and receiver operating characteristic (ROC) curve analysis. 
Results: After PSM, 256 well-matched patients (128 in each group) were included, with excellent covariate balance (all standardized mean differences <0.1). GV parameters (MAGE, CV, and SD) remained significantly higher in the DPN group than in the non-DPN group after matching (all P < 0.001). Within the DPN group, stratification by MAGE tertiles revealed a clear dose-response relationship: higher MAGE tertiles were associated with progressively worse composite NCV Z-scores (P for trend <0.001). Subgroup analysis (n=160) showed that higher MAGE tertiles were associated with elevated IL-6 and TNF-α levels and decreased NGF levels (P for trend <0.05). Multivariable linear regression confirmed MAGE (β = -0.38, P < 0.001) and CV (β = -0.31, P < 0.001) as independent negative predictors of NCV after adjusting for confounders including HbA1c. Mediation analysis indicated that IL-6 and TNF-α collectively mediated approximately 32% of the negative effect of MAGE on NCV (indirect effect β = -0.12, P < 0.001). ROC curve analysis identified optimal GV thresholds for discriminating DPN: MAGE ≥5.8 mmol/L (AUC = 0.84, sensitivity 76%, specificity 79%) and CV ≥32.5% (AUC = 0.81, sensitivity 72%, specificity 77%). Conclusion: In this propensity score-matched cohort study, higher glycemic variability was independently and robustly associated with the presence and severity of diabetic peripheral neuropathy in patients with T2DM, even after accounting for HbA1c and other conventional risk factors. The association exhibited a dose-response relationship and was partially mediated by systemic inflammation. Our findings support incorporating GV assessment into clinical practice for better DPN risk stratification and suggest that therapeutic strategies aimed at reducing glycemic variability may offer additional neuroprotective benefits.
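The cutoffs reported above (MAGE ≥5.8 mmol/L, CV ≥32.5%) come from ROC curve analysis. A minimal sketch of one standard rule for choosing such a threshold, the Youden index (sensitivity + specificity − 1); the function and toy values below are illustrative, not the study's code or dataset:

```python
def youden_threshold(values, labels):
    """Return (threshold, sensitivity, specificity) maximizing Youden's J.

    labels: 1 = case (e.g. DPN), 0 = control.
    Decision rule: value >= threshold predicts case.
    """
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    best = (None, 0.0, 0.0, -1.0)  # (threshold, sens, spec, J)
    for t in sorted(set(values)):  # every observed value is a candidate cutoff
        tp = sum(1 for v, y in zip(values, labels) if v >= t and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < t and y == 0)
        sens, spec = tp / n_pos, tn / n_neg
        j = sens + spec - 1
        if j > best[3]:
            best = (t, sens, spec, j)
    return best[:3]

# Toy data: cases tend to have higher glycemic-variability values.
controls = [2.1, 2.8, 3.0, 3.4, 3.9, 4.2]
cases = [4.8, 5.5, 5.9, 6.3, 7.0, 7.4]
thr, sens, spec = youden_threshold(controls + cases, [0] * 6 + [1] * 6)
```

The quadratic scan over candidate cutoffs is fine at this scale; for large datasets one would sort once and sweep the cutoff in a single pass.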
- New
- Research Article
- 10.1136/archdischild-2025-329495
- Mar 4, 2026
- Archives of disease in childhood. Fetal and neonatal edition
- Hannah Farley + 4 more
To describe changes in early respiratory support for infants born at <30 weeks' gestational age (GA) in England and Wales. Retrospective cohort study using data from the National Neonatal Research Database of all infants born at <30 weeks GA, admitted to neonatal units in England and Wales from 2016 to 2021. Methods of respiratory support used in the delivery room and days 1 and 7 of care were determined. Success of the initial non-invasive respiratory support strategy was assessed by any use of mechanical ventilation in the first 7 days of care. 24 107 babies were included. Use of continuous positive airway pressure (CPAP) and high-flow nasal cannula (HFNC) as the highest method of respiratory support for stabilisation increased during the study period (CPAP: 17.3% to 28.8%; HFNC: 0% (first recorded in 2016) to 0.7%). CPAP use increased in the most preterm (<25 weeks GA; 0.7% to 4.8%), the extremely preterm (<28 weeks GA; 7.2% to 17.5%) and the very preterm (28-29 weeks GA; 29.3% to 44.1%) cohorts. Among those initially stabilised with non-invasive ventilation in this study, 2763 (48.0%) infants required mechanical ventilation in the first week. In England and Wales, use of non-invasive respiratory support for initial stabilisation has increased among babies born at <30 weeks GA. 48% of those stabilised with non-invasive ventilation required mechanical ventilation in the first week. A higher quality evidence base for interventions that reduce mechanical ventilation could improve respiratory management in this population.
- New
- Research Article
- 10.1001/jamanetworkopen.2026.0461
- Mar 4, 2026
- JAMA network open
- Elijah Mak + 13 more
Sex differences are increasingly recognized as modifiers of Alzheimer disease and related dementias, with women exhibiting greater tau burden and faster cognitive decline than men. Although α-synuclein copathology frequently occurs in Alzheimer disease, its contribution to sex differences in disease progression is unclear. This study tested whether α-synuclein positivity, measured using a cerebrospinal fluid seed amplification assay (SAA), is differentially associated with tau accumulation in women vs men across the Alzheimer disease continuum. This cohort study used longitudinal tau positron emission tomography from the Alzheimer's Disease Neuroimaging Initiative collected between 2015 and 2023, with a median (IQR) follow-up of 1.23 (0.00-3.84) years. Participants were stratified by cerebrospinal fluid α-synuclein SAA status and sex, and were cognitively unimpaired or cognitively impaired (mild cognitive impairment or dementia) at baseline. Cerebrospinal fluid α-synuclein status was determined by SAA and dichotomized as SAA negative or SAA positive. Tau burden was quantified as the standardized uptake value ratio (SUVr) in the medial temporal composite region of interest. Linear mixed-effects models tested SAA × sex × time interactions on longitudinal tau accumulation, adjusting for baseline age, baseline cognitive status, apolipoprotein E ε4 carrier status, and site. Sample size estimates were calculated to detect 25% and 50% treatment effects with 80% power in those with cognitive impairment. Among 415 participants (mean [SD] age, 72.3 [7.6] years; 220 women [53%]; 69 SAA positive [17%] and 346 SAA negative [83%]), there was a significant interaction between SAA status, sex, and time on tau accumulation (β, 0.061; 95% CI, 0.030-0.093; P < .001). Women with positive SAA results exhibited the fastest tau accumulation of any group (0.066 SUVr per year; 95% CI, 0.043 to 0.089 SUVr per year; P < .001). 
Clinical trials targeting tau pathology in cognitively impaired individuals with 18-month follow-up would require 129 SAA-positive women to detect a 25% treatment effect with 80% power, compared with 518 SAA-negative women. In this cohort study of participants across the Alzheimer disease continuum, α-synuclein copathology was associated with faster tau accumulation in women than in men. These findings may inform sex-specific interpretation of α-synuclein biomarkers and trial design.
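The sample-size estimates above (129 vs 518 women for a 25% effect) rest on standard power arithmetic: the required n scales with the inverse square of the detectable effect, which is why halving the effect size roughly quadruples the sample. A hedged sketch of the classic two-sample formula; the effect size below is taken as a fraction of the reported 0.066 SUVr/year slope, but the SD of 0.05 is an illustrative assumption, not the study's variance, so the outputs will not reproduce the published numbers:

```python
from math import ceil
from statistics import NormalDist

def n_per_arm(delta, sd, alpha=0.05, power=0.80):
    """Participants per arm for a two-sample z-test on a mean difference.

    delta: detectable difference; sd: assumed common standard deviation.
    """
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_b = NormalDist().inv_cdf(power)
    return ceil(2 * ((z_a + z_b) * sd / delta) ** 2)

# A 25% slowing of a 0.066 SUVr/year accumulation rate is a treatment
# effect of 0.0165 SUVr/year; SD of 0.05 is an illustrative assumption.
n25 = n_per_arm(delta=0.25 * 0.066, sd=0.05)
n50 = n_per_arm(delta=0.50 * 0.066, sd=0.05)
```

Doubling the detectable effect (25% → 50%) cuts the required sample by roughly a factor of four, mirroring the study's motivation for enriching trials with faster-accumulating subgroups.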
- New
- Research Article
- 10.5603/pjnns.109467
- Mar 4, 2026
- Neurologia i neurochirurgia polska
- Kinga Glądys + 4 more
We examined whether cellular fibronectin (cFn) is associated with thromboembolic events or bleeding in anticoagulated patients with atrial fibrillation (AF). Up to 5% of anticoagulated AF patients experience major bleeding, and identifying those at elevated risk remains challenging. Cellular fibronectin, a marker of endothelial dysfunction and vascular injury, has been linked to altered prothrombotic fibrin clot features in atherosclerosis, stroke, and venous thromboembolism, but its levels and role in AF are unknown. In this cohort study, we enrolled 185 consecutive AF patients on rivaroxaban with a median age of 70.0 years (interquartile range [IQR] 62.0-76.0) and a median CHA₂DS₂-VASc score of 3.0 (IQR 1.0-4.0). We determined plasma cFn along with clinical and laboratory parameters, including fibrin clot permeability. During a median follow-up of 49.0 (IQR 46.0-51.0) months, we recorded major and clinically relevant nonmajor bleeding according to the International Society on Thrombosis and Haemostasis (ISTH) definitions, together with a composite ischemic endpoint comprising ischemic cerebrovascular events or cardiovascular (CV) death. Plasma cFn (median 3.5, IQR 2.7-4.5 μg/mL) correlated positively with fibrinogen and C-reactive protein (CRP), and inversely with clot permeability. During follow-up, bleeding occurred in 25 patients (13.5%, 3.55/100 patient-years), including major bleeding in 13 (7.0%, 1.78/100 patient-years). Patients with cFn in the lowest quartile (<2.7 μg/mL) had an almost 4-fold higher risk of bleeding compared with those in the highest quartile (>4.5 μg/mL) [hazard ratio (HR) 3.85, 95% confidence interval (CI): 1.06-14.00]. On multivariable analysis adjusted for age and sex, low cFn was an independent predictor of bleeding and major bleeding, but not of the ischemic endpoint. 
Our results suggest that low cFn levels might help identify patients with AF at increased bleeding risk during long-term anticoagulation, and this association could partly be related to the formation of looser fibrin networks.
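The event rates above are simple person-time arithmetic: events per 100 patient-years of observation. A minimal sketch; the total follow-up is back-calculated from the reported summary figures, not taken from individual-level data:

```python
def rate_per_100py(events, person_years):
    """Incidence rate expressed per 100 patient-years."""
    return 100 * events / person_years

# Back-calculate the observation time implied by the reported bleeding
# rate: 25 events at 3.55 per 100 patient-years.
implied_py = 100 * 25 / 3.55  # ~704 patient-years

# The major-bleeding figures (13 events, 1.78/100 patient-years) imply a
# similar total (~730 patient-years), consistent up to rounding.
implied_py_major = 100 * 13 / 1.78
```

This is also roughly what the cohort size and follow-up suggest: 185 patients followed for a median of 49 months is on the order of 750 patient-years.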
- New
- Research Article
- 10.1093/ehjimp/qyag040
- Mar 4, 2026
- European Heart Journal - Imaging Methods and Practice
- Nadim Nasrallah + 15 more
Aims: People with HIV (PWH) and undetectable virus experience elevated cardiovascular risk independent of traditional risk factors. Vascular inflammation may contribute to this residual risk. The perivascular fat attenuation index (FAI), derived from coronary computed tomography angiography (CCTA), is a biomarker of coronary inflammation. Lipoprotein(a) [Lp(a)] carries oxidized phospholipids that may promote inflammation. Statins have demonstrated cardiovascular benefit in PWH, including pleiotropic anti-inflammatory effects. This study assessed the associations of Lp(a) and of statin use with coronary inflammation (FAI) in men with HIV (MWH). Methods: We analyzed FAI of the left anterior descending (LAD) and right coronary (RCA) arteries in 583 men from the Multicenter AIDS Cohort Study, a prospective, multicenter cohort study, including 280 with undetectable HIV RNA (<50 copies/mL). Associations between log₁₀[Lp(a)] and LAD and RCA FAI were assessed using linear regression adjusting for demographic and cardiovascular risk factors. Results: Log₁₀[Lp(a)] was associated with LAD FAI in MWH with undetectable HIV in adjusted analysis [+1.99 HU (0.38, 3.59); p = 0.02] but not among men without HIV (MWoH) or MWH with detectable HIV. Associations with RCA FAI were significant only in unadjusted analysis. Statin use was associated with lower FAI (i.e., less inflammation) in the LAD in MWH with undetectable virus but did not modify the association between Lp(a) and coronary inflammation. Conclusions: Lp(a) was associated with increased coronary inflammation, independent of traditional cardiovascular risk factors, in MWH with undetectable virus. Statin therapy did not modify the relationship between coronary inflammation and Lp(a).
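The +1.99 HU association above is a regression coefficient per log₁₀ unit of Lp(a). A minimal unadjusted sketch of how such a slope is computed; the real analysis adjusted for demographic and cardiovascular covariates, and the data below are synthetic, constructed so FAI rises by 2 HU per doubling of Lp(a):

```python
from math import log10

def ols_slope(x, y):
    """Least-squares slope of y on x (single predictor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

lpa = [10, 20, 40, 80, 160]                # nmol/L, illustrative
fai = [-78.0, -76.0, -74.0, -72.0, -70.0]  # HU, +2 per doubling of Lp(a)

# Slope in HU per log10 unit of Lp(a), the scale used in the abstract.
slope = ols_slope([log10(v) for v in lpa], fai)
```

With a perfectly linear trend of 2 HU per doubling, the fitted slope is 2/log₁₀(2) ≈ 6.64 HU per log₁₀ unit, illustrating how a per-doubling effect translates to the per-log₁₀ scale reported above.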