Articles published on Risk Of Infection
- New
- Research Article
- 10.1161/circ.152.suppl_3.4364919
- Nov 4, 2025
- Circulation
- Bahy Abofrekha + 6 more
Background: Antiphospholipid syndrome (APS) predisposes patients to thrombosis and cardiac valve lesions (e.g., Libman-Sacks endocarditis). These vegetations, though sterile, may serve as a nidus for infection. The risk of infective endocarditis (IE) and other serious infections in APS patients within large populations remains poorly quantified, representing a key knowledge gap. Research Questions/Hypothesis: To quantify the risk of the primary outcome, IE, and secondary outcomes of MRSA sepsis and MSSA sepsis, associated with APS using a large, nationally representative inpatient database. Methods/Approach: This retrospective cross-sectional study utilized the Nationwide Inpatient Sample (NIS) database from 2016 to 2020. Hospitalized patients aged 18-75 with APS were compared to those without APS. Patients with major pre-existing risks for IE or significant confounders (e.g., prosthetic valves, specific congenital/rheumatic heart diseases, ESRD) were excluded. Multivariable logistic regression was used to calculate adjusted odds ratios (aORs) with 95% confidence intervals (CIs), adjusting for age, sex, race/ethnicity, hospital region, primary payer, median household income, and Systemic Lupus Erythematosus (SLE) status. Results/Data: A total of 297,459 patients met inclusion criteria; 223 hospitalizations (0.075%) had an APS diagnosis. APS patients were significantly younger (mean age 46.9 ± 14.4 vs. 50.8 ± 14.6 years, p<0.001) and more often female (72.6% vs. 40.0%, p<0.001). Unadjusted analyses revealed higher IE prevalence in APS (8.5% vs. 4.3%, p = 0.002), MRSA sepsis (14.3% vs. 8.9%, p = 0.004), and MSSA sepsis (15.2% vs. 9.6%, p = 0.004). In multivariable analysis, APS was significantly associated with over double the odds of IE (aOR 2.03; 95% CI 1.22–3.37; p = 0.007). APS also conferred increased risks of MRSA sepsis (aOR 1.75; 95% CI 1.18–2.58; p=0.005) and MSSA sepsis (aOR 1.86; 95% CI 1.28–2.70; p=0.001). In-hospital mortality within the IE cohort was not significantly different (0.2% vs. 0.1%, p = 0.543). Conclusion(s): APS emerged as a significant independent risk factor for IE, MRSA, and MSSA sepsis in this nationwide analysis. These findings suggest a broader vulnerability to infection in APS, highlighting the critical need for increased clinical suspicion, vigilant monitoring, and potentially tailored prophylactic or treatment approaches for severe infections in these patients.
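The adjusted odds ratios above come from multivariable logistic regression on discharge records. As a rough sketch of that kind of analysis (not the authors' code; the dataframe and column names below are hypothetical), exponentiating the fitted coefficients and their confidence bounds yields aORs with 95% CIs:

```python
# Minimal sketch of a multivariable logistic regression yielding adjusted ORs,
# in the spirit of the NIS analysis above. All column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def adjusted_odds_ratios(nis: pd.DataFrame) -> pd.DataFrame:
    """Fit outcome ~ exposure + confounders and return aORs with 95% CIs."""
    fit = smf.logit(
        "ie ~ aps + age + female + C(race) + C(region) + C(payer) + C(income_q) + sle",
        data=nis,
    ).fit(disp=False)
    ci = fit.conf_int()                      # bounds on the log-odds scale
    return pd.DataFrame({
        "aOR": np.exp(fit.params),           # exponentiate to the odds-ratio scale
        "2.5%": np.exp(ci[0]),
        "97.5%": np.exp(ci[1]),
        "p": fit.pvalues,
    })

# usage: adjusted_odds_ratios(nis).loc["aps"]  -> aOR, CI, and p for the APS term
```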
- New
- Research Article
- 10.1161/circ.152.suppl_3.4370764
- Nov 4, 2025
- Circulation
- John Weng + 3 more
Introduction: Sudden changes in ventricular cycle lengths (or short-long-short [S-L-S] sequences) can initiate polymorphic ventricular tachycardia (VT). This pattern involves a premature ventricular contraction (PVC), followed by a compensatory pause and a subsequent premature beat. The compensatory pause leading to the long cycle is followed by a prolonged repolarization phase. Suppressing S-L-S sequences with anti-bradycardia pacing may prevent polymorphic VT in at-risk patients. Herein we describe a patient being treated for fungal endocarditis who developed recurrent VT storm and in whom anti-bradycardia pacing via a single-chamber right atrial leadless pacemaker prevented further sustained VT episodes. Case: A 69-year-old patient with fungal bioprosthetic aortic valve endocarditis developed sustained VT, from which he was successfully resuscitated. He was prescribed a secondary-prevention wearable defibrillator (LifeVest) on discharge, but over a 2-week period he received 6 appropriate shocks from the device for polymorphic and monomorphic VT episodes despite treatment with multiple anti-arrhythmic agents. In-patient telemetry identified S-L-S sequences preceding the VT episodes. Subsequently, a single-chamber right atrial leadless pacemaker was placed and programmed to pace at 80 beats per minute. At 8 weeks post-implant, he experienced no further VT episodes (non-sustained or sustained). Discussion: Due to ongoing treatment for fungemia, placement of a transvenous pacing/defibrillator system was relatively contraindicated because of the high risk of cardiac device infection. While a subcutaneous ICD could have been implanted with less infectious risk, it lacks pacing function and would not have prevented further VT episodes. Conclusion: This case highlights a unique management strategy utilizing anti-bradycardia pacing via a single-chamber right atrial leadless pacemaker to prevent ventricular arrhythmias triggered by S-L-S sequences when other treatment options were less feasible.
- New
- Research Article
- 10.1161/circ.152.suppl_3.4370578
- Nov 4, 2025
- Circulation
- Valerie Vilarino + 3 more
Background: Leadless pacemakers offer advantages in immunosuppressed patients or those with limited venous access. This case highlights the novel use of a dual-chamber leadless pacemaker (DC-LP) in a patient with a complex history: orthotopic heart transplant (OHT), kidney transplant (KT), and now end-stage renal disease (ESRD) on intermittent hemodialysis (iHD) via a left arm arteriovenous fistula (AVF). The procedure was complicated by transient atrial non-capture after atrial lead deployment. Description of Case: A 31-year-old male with a history of parvovirus myocarditis status-post OHT and prior KT, complicated by multiple graft rejections and now on iHD for ESRD via a left arm AVF, presented with bradycardia and hypotension during iHD. He denied chest pain or syncope but endorsed exertional fatigue. ECG revealed complete heart block with a junctional escape rhythm and AV dissociation. Two distinct P-wave morphologies suggested dual atrial activity from residual native tissue and the donor sinus node. Echocardiogram showed normal LV systolic function (EF 60–65%), mild to moderate RV dilation with mildly reduced function, and elevated RV systolic pressure. Coronary angiography showed no allograft vasculopathy, and biopsy revealed mild (1R) rejection. Dopamine transiently improved sinus rate and AV conduction. Due to persistent chronotropic incompetence, a DC-LP was implanted. The decision was driven by his KT history, current iHD, and left arm AVF, factors rendering venous preservation and infection risk reduction critical. Following atrial lead deployment, transient atrial non-capture was noted but resolved spontaneously, likely due to local tissue effects at the implant site. Final programming was AAI70/VVI40. Discussion: This is, to our knowledge, the first reported DC-LP case in a patient with both OHT and KT. His complex profile, including immunosuppression, a failed renal graft, left arm AVF, and ESRD, made him an ideal candidate for leadless pacing. DC-LP minimizes infection risk, preserves venous access, and avoids lead-related complications. The transient atrial non-capture, likely related to local tissue injury, resolved spontaneously, underscoring the modality's reliability in complex cases. Conclusion: DC-LP is a feasible and beneficial option for transplant recipients. This case demonstrates successful DC-LP use in a patient with OHT, KT, iHD dependence, and a left arm AVF, while emphasizing the importance of recognizing post-implant pacing anomalies that may self-resolve.
- New
- Research Article
- 10.1161/circ.152.suppl_3.4366012
- Nov 4, 2025
- Circulation
- Siya Bhagat + 4 more
Introduction: Advanced heart failure in incarcerated individuals presents unique clinical, ethical, and logistical challenges. Mechanical circulatory support (MCS), including left ventricular assist devices (LVAD), offers life-saving therapy for patients ineligible for heart transplant. However, access remains limited in the prison system due to systemic healthcare disparities. This case highlights the complexity of managing cardiogenic shock in incarcerated patients and barriers to equitable advanced heart failure care with MCS. Case Presentation: A 57-year-old incarcerated male with non-ischemic cardiomyopathy presented with abdominal pain, dyspnea, and atrial fibrillation with rapid ventricular response. Exam revealed hypotension, cold extremities, and elevated jugular venous pressure. Workup showed a renal infarct (presumed cardioembolic), EF 8%, biventricular dysfunction, and cardiogenic shock despite cardioversion and rhythm control, requiring norepinephrine (0.5 mcg/kg/min), epinephrine (0.1 mcg/kg/min), and dobutamine (2.5 mcg/kg/min). Though initially ineligible for transplant or LVAD due to incarceration and prior substance use, he was temporarily stabilized with an Impella 5.5. He was transitioned to milrinone with clinical improvement and discharged to his correctional facility with plans for reassessment for advanced options following his expected imminent release from prison. Unfortunately, he was readmitted with recurrent cardiogenic shock and MRSE bacteremia. Following stabilization with dual inotropes and aggressive afterload reduction, as well as completion of antibiotic therapy, he underwent successful LVAD placement. Discussion: This case underscores the barriers to providing durable LVAD therapy to incarcerated patients. Coordination of care is limited by the constraints of the correctional health system, including staff education, restricted access to specialized centers, and difficulty with device maintenance such as hygiene, battery care, and emergency care in case of device malfunction. Incarceration also limits social support, impacts adherence, and contributes to psychological stress. Infection risk and security concerns further complicate care. Addressing these challenges is essential to ensure that incarcerated individuals receive equitable access to advanced heart failure therapies and thus facilitate optimal outcomes.
- New
- Research Article
- 10.1161/circ.152.suppl_3.4341515
- Nov 4, 2025
- Circulation
- Hector Santos Argueta + 7 more
Background: Cardiac implantable electronic devices (CIEDs) are increasingly used to manage various cardiac conditions but are linked to a higher risk of severe infections, including bacteremia and infective endocarditis (IE). These infections often lead to longer hospital stays, higher mortality rates, and more complex clinical management. Understanding the influence of CIEDs on outcomes in these conditions is critical for optimizing treatment and informing clinical decisions, particularly regarding device removal. Methods: We utilized the National Inpatient Sample (NIS) 2016–2019 to identify adult patients hospitalized with either a primary diagnosis of bacteremia or IE. For both cohorts, patients were stratified by the presence or absence of a CIED, identified using ICD-10 codes. The primary outcome was inpatient mortality, and the secondary outcome was length of stay (LOS). Multivariate regression analyses were performed to adjust for potential confounders. Results: Among 671,334 patients with bacteremia, 39,875 (5.9%) had CIEDs and 631,459 (94.1%) did not. In the CIED group, 1,270 patients (3.1%) died, with an adjusted odds ratio (OR) of 0.86 (95% CI 0.75–0.98; p = 0.03) compared to the non-CIED group. CIED removal was not significantly associated with improved mortality (OR 0.2, 95% CI 0.028–1.54). The average LOS for patients with bacteremia was 5 days, but those with CIEDs had significantly longer stays (coefficient 2.67, 95% CI 2.44–2.90). Among 229,470 patients with IE, 15,840 (7%) had CIEDs and 213,630 (93%) did not. Mortality was significantly higher in the CIED group at 7.4% (1,184 deaths), with an OR of 2.51 (95% CI 2.17–2.91). Notably, CIED removal was associated with significantly lower mortality (OR 0.52, 95% CI 0.44–0.63). The average LOS for patients with CIEDs and IE was 10.5 days, significantly longer than for non-CIED patients (coefficient 5.2, 95% CI 4.8–5.5). Conclusion: Our analysis reveals a nuanced relationship between CIEDs and patient outcomes in bloodstream infections. While patients with CIEDs and bacteremia had lower inpatient mortality despite longer hospital stays, those with CIEDs and IE experienced significantly higher mortality and longer hospitalization. Importantly, CIED removal was associated with a survival benefit in IE but not in bacteremia. These findings underscore the need for infection-specific approaches to managing patients with CIEDs, particularly concerning the timing and role of device removal to improve patient outcomes.
- New
- Research Article
- 10.1097/ms9.0000000000004248
- Nov 4, 2025
- Annals of Medicine & Surgery
- Muhammad Khizar + 4 more
Severe cranial defects caused by trauma, tumor resection, or congenital anomalies remain challenging to repair. Conventional cranioplasty using autografts or metal/polymer plates carries risks of donor-site morbidity, infection, and suboptimal contour. Patient-specific 3D-printed “living” implants, biodegradable scaffolds seeded with autologous cells, offer a precise anatomical fit and active bone regeneration. Early clinical applications in countries including South Korea, Egypt, and Brazil demonstrate improved cosmetic outcomes, accelerated bone formation, and reduced operative time. Integrating tissue engineering with additive manufacturing represents a shift in neurosurgical reconstruction, providing biologically active solutions that restore both form and function. Ongoing research focuses on vascularization, cell viability, and clinical translation. Living 3D-printed cranial implants have the potential to redefine standards of care, emphasizing regenerative repair over inert prosthetic replacement.
- New
- Research Article
- 10.1161/circ.152.suppl_3.4368401
- Nov 4, 2025
- Circulation
- Rachel Shustak + 8 more
Background: Secondhand smoke (SHS) exposure affects over 40% of children in the US and increases their risk of respiratory infections, asthma, and sudden infant death syndrome. Children with heart disease may be particularly vulnerable due to altered cardiopulmonary physiology, including shunt lesions and baseline hypoxemia. While some pediatric primary care settings have adopted electronic health record (EHR)-linked systems to address parent tobacco use, their feasibility in pediatric subspecialty settings such as cardiology is unknown. Objective: To evaluate the feasibility of an automated clinical decision support tool within the EHR to provide smoking cessation counseling and treatment to parents and household members of children evaluated in an outpatient pediatric cardiology clinic. Methods: This prospective study was conducted at two outpatient pediatric cardiology clinics. All families presenting for follow-up visits received an EHR-linked questionnaire assessing caregiver and household tobacco use. The questionnaire directly screens for parent and household member tobacco use, delivers brief motivational messaging, and connects interested individuals to evidence-based treatment, including nicotine replacement therapy (NRT) and/or counseling via phone or text. We used EHR utilization data to assess questionnaire completion rates, tobacco use identification, and treatment acceptance. Analyses were stratified by patient age, sex, race, insurance status, and neighborhood Child Opportunity Index (COI). Results: Between 3/19/25 and 5/26/25, 545 questionnaires were assigned, of which 465 (85%) were completed. Parents who smoke were identified at 12 (2.6%) visits, and 7 (58%) were interested in and offered treatment, with 6 prescribed NRT. Other household members who smoke were identified at 24 (5.3%) visits, and 4 (16.7%) were referred for treatment. There was no difference in questionnaire completion by patient age, sex, or race. Parents of children with Medicaid insurance and lower neighborhood COI were less likely to complete the assigned screener but more likely to smoke (Figs. 1 and 2). There was no difference in treatment acceptance by demographic characteristics. Conclusion: An EHR-linked system for parental smoking cessation was feasible in pediatric cardiology clinics. Future research should evaluate its impact on clinical outcomes across diverse populations, especially in those with low COI and high-risk cardiac physiology.
- New
- Research Article
- 10.1161/circ.152.suppl_3.4357653
- Nov 4, 2025
- Circulation
- Mikiko Matsumura + 8 more
Background: Leadless pacemakers provide clinical advantages over traditional transvenous systems, including reduced risk of infections, thromboembolism, and lead-related complications, all of which adversely affect patient safety and quality of life. The recent transition from VVI to VDD mode in leadless pacemakers is theoretically advantageous, as atrioventricular (AV) synchronization may enhance hemodynamic performance and alleviate symptoms of heart failure. However, the clinical impact of this mode transition has not been systematically studied. Methods: This retrospective cohort study analyzed 133 consecutive patients (mean age: 85.6±6.5 years, male: 54%, Micra VR: 63.9%) who were diagnosed with AV block and underwent de novo implantation of leadless pacemakers between September 2017 and March 2024. The outcome measure was heart failure hospitalizations. Multivariate analyses were performed to identify factors associated with heart failure hospitalization within 3 years. Results: The follow-up periods were 323 days (interquartile range [IQR]: 132–613) for the Micra VR group and 392 days (IQR: 266–620) for the Micra AV group, a statistically significant difference. Both groups had comparable patient backgrounds, with no significant differences in notable risk factors including age, sex, history of heart failure, presence of atrial fibrillation (AF), diabetes, or chronic kidney disease. Kaplan-Meier analysis revealed that patients with Micra VR had a significantly higher rate of heart failure hospitalizations compared to those with Micra AV (log-rank, p = 0.037). Multivariate analysis showed that Micra VR (HR, 3.22; 95% CI, 1.15-8.98; P = 0.026) and history of AF (HR, 5.69; 95% CI, 2.00-16.2; P = 0.001) were significantly associated with heart failure hospitalization. Conclusion: The current study revealed that Micra AV was associated with a significantly reduced risk of heart failure hospitalizations compared with Micra VR over a follow-up period of up to 3 years, potentially due to improved atrioventricular synchrony.
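The group comparison described above rests on Kaplan-Meier curves with a log-rank test. The sketch below shows how such a comparison could be set up; the per-patient dataframe and its columns (`group`, `days_to_hf_hosp`, `hf_hosp`) are hypothetical, and this is not the study's actual analysis code:

```python
# Sketch of a Kaplan-Meier comparison of heart-failure hospitalization between
# two pacemaker groups, with a log-rank test. Column names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

def compare_hf_hospitalization(df: pd.DataFrame):
    vr = df[df["group"] == "Micra VR"]
    av = df[df["group"] == "Micra AV"]

    km_vr, km_av = KaplanMeierFitter(), KaplanMeierFitter()
    km_vr.fit(vr["days_to_hf_hosp"], event_observed=vr["hf_hosp"], label="Micra VR")
    km_av.fit(av["days_to_hf_hosp"], event_observed=av["hf_hosp"], label="Micra AV")

    # log-rank test for a difference in event-free survival between groups
    result = logrank_test(
        vr["days_to_hf_hosp"], av["days_to_hf_hosp"],
        event_observed_A=vr["hf_hosp"], event_observed_B=av["hf_hosp"],
    )
    return km_vr, km_av, result.p_value
```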
- New
- Research Article
- 10.1007/s12026-025-09713-7
- Nov 4, 2025
- Immunologic research
- Karina Santana-De-Anda + 9 more
Patients with idiopathic inflammatory myopathies (IIM) are at increased risk for infections. Identifying clinical and immunological biomarkers predictive of infection is essential. We included 169 patients from the MYOTReCSZ cohort, all with ≥ 6 months of follow-up. Clinical data and laboratory parameters were collected, including: (1) low-density granulocytes and monocyte subsets, (2) serum cytokines, and (3) neutrophil extracellular trap (NET) quantification. The primary outcome was infection development. Most patients were female (72.78%), with a median age of 42. At least one infection occurred in 46.7% of patients; 55.6% were severe and 32.9% had recurrent infections. Independent predictors of infection included number of immunosuppressants (OR 1.7, P = 0.023), gastrointestinal activity score, cardiovascular damage-VAS, anti-Jo1 positivity (OR 10.0, P = 0.05), heliotrope rash, alopecia, and mycophenolate mofetil use (OR 11.9, P = 0.026). Severe infections were associated with number of immunosuppressants, low albumin, constitutional activity score, gastrointestinal damage-VAS, and TLR4⁺ intermediate monocytes (OR 1.0, P = 0.038). Recurrent infections correlated with lower TLR2⁺ classical monocytes (OR 0.4, P = 0.045), cumulative prednisone dose, global damage-VAS (OR 2.0, P = 0.0004), and anti-PM/Scl75 positivity (OR 3.8, P = 0.006). In conclusion, IIM patients with higher baseline activity and damage scores, specific autoantibodies, and altered innate immune cell phenotypes are more likely to develop infections. These parameters may serve as early biomarkers to stratify infection risk in clinical practice.
- New
- Research Article
- 10.1161/circ.152.suppl_3.4348886
- Nov 4, 2025
- Circulation
- Tanawat Attachaipanich + 2 more
Background: Cardiogenic shock is associated with high mortality and usually requires ventilatory support. Non-invasive ventilation (NIV) has demonstrated benefits in cardiogenic pulmonary edema, including reduced risk of infection and shorter hospital stays compared to invasive mechanical ventilation (IMV). However, the efficacy and safety of NIV specifically in cardiogenic shock remain unclear. This systematic review and meta-analysis aimed to evaluate the efficacy and safety of NIV compared to IMV in patients with cardiogenic shock. Methods: A systematic search was conducted across 4 databases, including PubMed, Embase, Web of Science, and Cochrane CENTRAL, from inception to February 12, 2025, without language restrictions. Studies were included if they compared the efficacy and safety of NIV and IMV in patients with cardiogenic shock. Results: A total of 6 studies involving 2,302 participants were included in this meta-analysis, using a random-effects model. NIV was associated with a significantly lower risk of in-hospital mortality compared to IMV, with a risk ratio (RR) of 0.70 (95%CI 0.52 to 0.94), p=0.02. NIV was also associated with a lower risk of 30-day all-cause mortality, with an RR of 0.63 (95%CI 0.51 to 0.77), p<0.01. NIV was associated with a shorter length of ICU/CCU stay (weighted mean difference (WMD) of -2.06 days; 95%CI -2.76 to -1.37; p<0.01) and hospital stay (WMD of -3.20 days; 95%CI -5.33 to -1.07; p<0.01) compared to IMV. Conclusions: NIV appears to be an effective and safe alternative to IMV in carefully selected patients with cardiogenic shock.
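The pooled risk ratios above come from a random-effects model. As an illustration of the general approach (a DerSimonian-Laird estimator; the abstract does not specify which random-effects estimator was used), a pooled RR and its 95% CI can be computed from study-level event counts as follows, with the inputs purely hypothetical:

```python
# Sketch of a DerSimonian-Laird random-effects pooled risk ratio from
# study-level event counts and sample sizes (NIV vs IMV arms, hypothetical).
import numpy as np

def pooled_risk_ratio(e_niv, n_niv, e_imv, n_imv):
    e_niv, n_niv, e_imv, n_imv = map(np.asarray, (e_niv, n_niv, e_imv, n_imv))
    log_rr = np.log((e_niv / n_niv) / (e_imv / n_imv))          # per-study log RR
    var = 1 / e_niv - 1 / n_niv + 1 / e_imv - 1 / n_imv         # its variance

    w_fixed = 1 / var
    mu_fixed = np.sum(w_fixed * log_rr) / np.sum(w_fixed)
    q = np.sum(w_fixed * (log_rr - mu_fixed) ** 2)              # Cochran's Q
    c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)                # between-study variance

    w = 1 / (var + tau2)                                        # random-effects weights
    mu = np.sum(w * log_rr) / np.sum(w)
    se = np.sqrt(1 / np.sum(w))
    return np.exp(mu), np.exp(mu - 1.96 * se), np.exp(mu + 1.96 * se)
```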
- New
- Research Article
- 10.1161/circ.152.suppl_3.4370391
- Nov 4, 2025
- Circulation
- Gaurav Sharma + 3 more
Background: Cardiac devices such as prosthetic valves, coronary bypass grafts, and implantable hardware have improved survival but are associated with increased infection risk. Infective endocarditis (IE) remains a life-threatening complication of device-related infections. Despite advances in antimicrobial strategies and perioperative protocols, long-term national trends in endocarditis-related mortality with cardiac device complications remain under-characterized. Methods: Using the CDC WONDER Multiple Cause of Death database (1999–2023), we identified decedents aged 15–84 years in whom infective endocarditis (ICD-10: I33.0, I33.9, I38) and cardiac device complications (ICD-10: T82.0, T82.2, T82.6, T82.7) were listed as contributing causes of death. Age-adjusted mortality rates (AAMRs) per 100,000 population were calculated using the 2000 U.S. standard population. Joinpoint regression (v5.4.0) identified inflection points in trends and calculated Annual Percent Change (APC). Results: From 1999 to 2023, a final model with two joinpoints was identified. Between 1999 and 2006, AAMRs increased significantly with an APC of +8.00% (95% CI: 2.06 to 14.29; p = 0.01). This was followed by a sharp decline from 2006 to 2011 (APC = –10.30%; 95% CI: –20.90 to +1.72; p = 0.086). Between 2011 and 2023, mortality trends plateaued, with a modest but non-significant rise (APC = +0.71%; 95% CI: –1.57 to +3.06; p = 0.52). Overall, mortality rates stabilized over the last decade despite the earlier fluctuations. Conclusion: The trajectory of endocarditis-related mortality in patients with cardiac device complications reveals a critical inflection in modern cardiovascular care. The initial rise mirrors a surge in device utilization without parallel infection safeguards. The subsequent decline suggests early wins from antimicrobial stewardship and surgical protocol refinement. However, the post-2011 plateau, despite advances in materials and perioperative care, signals a stagnation point, not success. This stagnation likely reflects unresolved mechanistic challenges such as biofilm resilience, hematogenous microbial seeding, and late-onset device colonization. These findings position infective endocarditis as a high-fidelity surrogate for late device-related mortality burden. Urgent innovation is needed: biocompatible surface technologies, sustained post-implantation surveillance, and precision infection diagnostics must now lead the next phase of device-era infection prevention.
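Joinpoint's Annual Percent Change is derived from a log-linear model of the rate on calendar year within each segment, so APC = (exp(slope) - 1) * 100. The sketch below illustrates that calculation for a single segment; it is a simplified stand-in, not the NCI Joinpoint software used by the authors, and the year/AAMR inputs are hypothetical:

```python
# Illustration of how an annual percent change (APC) is obtained within one
# joinpoint segment: regress ln(rate) on year, then APC = (exp(slope) - 1) * 100.
import numpy as np
import statsmodels.api as sm

def annual_percent_change(years, aamr):
    x = sm.add_constant(np.asarray(years, dtype=float))
    fit = sm.OLS(np.log(np.asarray(aamr, dtype=float)), x).fit()
    slope = fit.params[1]
    lo, hi = fit.conf_int()[1]                    # 95% CI for the slope
    to_apc = lambda b: 100.0 * (np.exp(b) - 1.0)  # convert log-slope to percent change
    return to_apc(slope), to_apc(lo), to_apc(hi)
```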
- New
- Research Article
- 10.1002/advs.202512592
- Nov 4, 2025
- Advanced science (Weinheim, Baden-Wurttemberg, Germany)
- Yinghao Wu + 9 more
High infection risk and poor tissue integration are major causes of percutaneous implant failure. Immune reprogramming is a promising strategy, but current implants rely predominantly on passive and static modulation of macrophages. Dynamically reprogramming macrophages based on physiological states to switch between antibacterial and tissue-healing functions remains a challenge. Here, a novel polyetheretherketone (PEEK) surface is developed by sequential modification through sulfonation, hydrogen plasma immersion ion implantation (H-PIII), and magnesium plasma immersion ion implantation (Mg-PIII), fabricating a Mg-H-SPEEK composite with a graphene-like matrix embedded with MgO. This film simultaneously enables sustained Mg2+ release and near-infrared (NIR) photothermal responsiveness for on-demand immunomodulation. Under normal conditions, Mg2+ released from Mg-H-SPEEK can promote macrophage reprogramming toward the M2 phenotype through the TNF, JAK-STAT, NF-κB, and IL-17 pathways to accelerate soft tissue repair. Upon NIR light exposure, photothermal stimulation enhances the expression of Traf1 in macrophages via the TNF, FoxO, and JAK-STAT signaling pathways to drive M1 reprogramming for bacterial phagocytosis. The dual-mode system synergizes hyperthermia and immune phagocytosis for infection resistance while preserving pro-healing functions, offering a smart strategy for percutaneous implants.
- New
- Research Article
- 10.1002/acr2.70117
- Nov 3, 2025
- ACR Open Rheumatology
- Leah K Flatman + 6 more
Objective: Tumor necrosis factor inhibitors (TNFi) are used by over 20% of pregnant women with chronic inflammatory diseases, which could further impede immune function and increase the risk of infections that require hospitalization. We assessed the risk of serious infections during pregnancy and postpartum between TNFi-exposed and unexposed women with chronic inflammatory diseases. Methods: Using MarketScan, we identified pregnant women with chronic inflammatory diseases and modeled TNFi exposure during pregnancy and postpartum as a time-varying variable. Cox proportional hazards models estimated adjusted hazard ratios (HRs) for TNFi and the risk of hospitalized infection. Results: We followed a total of 62,813 women who had 70,529 pregnancies and 69,412 births. Among these, 4,485 (7.1%) women were exposed to at least one TNFi prescription during pregnancy and 3,559 women during postpartum. Overall, 449 women were hospitalized for infection during pregnancy, including 31 pregnant women who were exposed to TNFi. During postpartum, 205 women had a hospitalized infection, of whom 17 were TNFi-exposed. Compared with no TNFi exposure, TNFi treatment during pregnancy was associated with an HR of 1.39 (95% confidence interval [CI], 0.95–2.05) for serious infections, whereas the HR during postpartum was 1.22 (95% CI, 0.72–2.06). Conclusion: In this population-based study, we found no statistically significant association between TNFi exposure during pregnancy and serious infection risk. Although point estimates were higher for pregnant women exposed to TNFi, CIs were wide and included the null, indicating that an increased risk cannot be ruled out. Given the frequency of TNFi treatment for pregnant women, these results support continued investigation and may inform counseling regarding TNFi treatment during pregnancy and postpartum.
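Because TNFi exposure was modeled as a time-varying variable, the Cox model is fit on data in long (start/stop) format. The sketch below shows one way such a model could be specified; the dataframe layout and column names are hypothetical and not taken from the study:

```python
# Sketch of a Cox model with TNFi exposure as a time-varying covariate, the
# general design described above. Column names (id, start, stop, tnfi,
# infection, plus any covariates already in long_df) are hypothetical.
from lifelines import CoxTimeVaryingFitter

def fit_time_varying_cox(long_df):
    """long_df: one row per follow-up interval per pregnancy; exposure (tnfi)
    may change between intervals, and the event is flagged on its interval."""
    ctv = CoxTimeVaryingFitter()
    ctv.fit(
        long_df,
        id_col="id",
        start_col="start",
        stop_col="stop",
        event_col="infection",
    )
    return ctv.summary  # hazard ratios are exp(coef) with 95% CIs
```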
- New
- Research Article
- 10.3389/fmed.2025.1694688
- Nov 3, 2025
- Frontiers in Medicine
- Zenan Tang + 3 more
Background Janus kinase (JAK)-1 inhibitors have been approved for moderate-to-severe atopic dermatitis (AD). Despite favorable efficacy, their real-world infection risk profile requires further investigation. Methods We conducted a retrospective disproportionality analysis using the U.S. Food and Drug Administration Adverse Event Reporting System (FAERS) database. Reports identifying upadacitinib or abrocitinib as primary suspect drugs for “Infections and Infestations” adverse events (AEs) in AD treatment from Q3 2019 to Q1 2025 were included. Four disproportionality methods were employed to detect infection-related safety signals. Results A total of 18 infection-related positive safety signals associated with abrocitinib were identified, which include known AEs (herpes zoster, eczema herpeticum, and herpes simplex) and unexpected signals (sepsis, appendicitis, and septic shock). Upadacitinib showed 64 infection-related signals, encompassing known AEs (herpes zoster, pneumonia, and influenza) and unexpected signals (sepsis, appendicitis, and septic shock). Herpes zoster was the most frequent infection-related AE for both drugs. Conclusion This study confirms established infection risks of JAK-1 inhibitors in AD (particularly herpes zoster) and identifies novel potential safety signals (sepsis, appendicitis, and septic shock). These findings provide real-world insights into the risk of infections associated with JAK inhibitors.
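Disproportionality analyses of FAERS reports typically compare how often a target adverse event is reported for the suspect drug versus all other drugs, for example via the reporting odds ratio (ROR). The abstract does not name its four methods, so the sketch below simply illustrates an ROR with a commonly used signal criterion, using a hypothetical 2x2 table:

```python
# Sketch of one common disproportionality measure for FAERS signal detection:
# the reporting odds ratio (ROR) from a 2x2 contingency table, where
# a = suspect drug + target AE, b = suspect drug + other AEs,
# c = other drugs + target AE, d = other drugs + other AEs.
import math

def reporting_odds_ratio(a, b, c, d):
    ror = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of ln(ROR)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    # a common positive-signal criterion: at least 3 reports and lower CI bound > 1
    signal = (a >= 3) and (lo > 1.0)
    return ror, lo, hi, signal
```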
- New
- Research Article
- 10.1177/00031348251393928
- Nov 3, 2025
- The American surgeon
- Paul Brosnihan + 6 more
Introduction: Bile spillage (BS) is common during laparoscopic cholecystectomy (LC) and has been shown to be associated with an increased risk of surgical site infection (SSI). We hypothesized that positive bile cultures (PBCs) increase the risk of postoperative complications, including SSI. Methods: A retrospective chart review was conducted including all patients older than 18 years undergoing urgent LC from January to September 2019. Charts were reviewed for the index admission and postoperative visits. We compared those who had PBCs with those who did not. Our primary endpoint was the rate of SSI. Univariate analysis and multivariate logistic regression were used to identify predictors of SSI. Results: 272 patients underwent LC. Indications for operation included acute cholecystitis (62.5%), symptomatic cholelithiasis (12.5%), and other indications (25%). Bile was spilled in 191 patients (70.2%). Positive bile cultures were obtained in 78 of 249 (31.3%) patients and were associated with preoperative endoscopic retrograde cholangiopancreatography (ERCP; 44% vs 26%, P = 0.014) and drain placement (50% vs 28.9%, P = 0.031). Eleven postoperative complications were noted, including 6 SSIs (2.2%). Positive bile culture (3.8% vs 1.8%, P = 0.38), BS (3.1% vs 0%, P = 0.18), ERCP (1.4% vs 2.5%, P = 1.0), and drain placement (6.7% vs 1.7%, P = 0.13) were not associated with SSI. Multivariate analysis demonstrated that positive cultures were not predictive of complications (P = 0.13) or SSI (P = 0.91). Conclusion: Positive bile cultures are not inherently associated with an increased risk of SSI and therefore should not lead to ongoing postoperative antibiotic therapy.
- New
- Research Article
- 10.1002/adfm.202518001
- Nov 3, 2025
- Advanced Functional Materials
- Jian Li + 9 more
Epilepsy is one of the most prevalent central nervous system disorders, with antiepileptic drugs (AEDs) as the mainstay of treatment. While effective in reducing seizure frequency in ≈70% of patients, traditional AEDs are limited by blood concentration fluctuations, drug resistance, and cognitive side effects. Neuromodulation, particularly electrical stimulation, has emerged as a promising alternative by reversibly regulating abnormal neural circuits. However, conventional systems require implanted electrodes and external power sources, increasing the risk of trauma and infection. Piezoelectric nanomaterials offer a non-invasive strategy by converting endogenous biomechanical forces or ultrasound stimulation into localized electric currents, inducing neuronal hyperpolarization to suppress excitation. Based on this mechanism, a biomimetic piezoelectric nanoplatform is developed capable of ultrasound-triggered electrical stimulation for targeted neuromodulation without surgical implantation. Additionally, these nanoplatforms can co-deliver AEDs, enabling a dual therapeutic approach that combines localized stimulation with sustained drug release, enhancing efficacy while minimizing systemic exposure. This synergistic integration of ultrasound-responsive piezoelectric nanoplatforms and pharmacotherapy represents a transformative paradigm for safe, effective, and non-invasive epilepsy treatment.
- New
- Research Article
- 10.3389/fpubh.2025.1604049
- Nov 3, 2025
- Frontiers in Public Health
- Niannian Bi + 9 more
Background Hepatitis E virus (HEV) is a major global public-health threat. University students are at high risk of HEV infection. This study aimed to assess the knowledge, attitude, and practice (KAP) levels regarding hepatitis E among university freshmen and their willingness to receive HEV vaccination. Methods A cross-sectional study was conducted from September to December 2023 among 3,276 freshmen from six universities in Anhui Province, China. A stratified cluster random sampling method was used to select participants, and data were collected using structured questionnaires. Multivariate logistic regression was performed to identify factors associated with KAP levels. Data were analyzed with SPSS version 23.0. Results Of the 3,276 questionnaires distributed, 3,120 were valid, for a response rate of 95.2%. Only 9.0% of participants had received the HEV vaccine. The overall correct knowledge rate of HEV was 50.8%. A positive attitude was reported by 59.9% of students, and 60.9% demonstrated good practices related to HEV. Multivariate analysis showed that vaccinated students had significantly higher knowledge levels than non-vaccinated students (OR = 1.999, 95% CI: 1.536–2.602). Female students (OR = 1.193, 95% CI: 1.029–1.382) and those from Wuhu (OR = 1.571, 95% CI: 1.299–1.900) also had higher knowledge levels. Medical students were more likely to have a positive attitude than non-medical students (OR = 1.367, 95% CI: 1.161–1.610). Students from rural areas (OR = 1.336, 95% CI: 1.148–1.553) and Wuhu (OR = 1.317, 95% CI: 1.088–1.594) showed higher levels of positive attitude. Rural students also reported better health practices than urban students (OR = 1.288, 95% CI: 1.088–1.524). The results also showed that both knowledge (r = 0.042, P = 0.020) and attitude (r = 0.049, P = 0.006) exhibited statistically significant but weak positive correlations with practice. Conclusions Over half of the university freshmen demonstrated good KAP levels regarding HEV. However, the vaccination rate remained low. The determinants identified will therefore guide health promotion and vaccine advocacy.
- New
- Research Article
- 10.1177/00333549251378101
- Nov 3, 2025
- Public health reports (Washington, D.C. : 1974)
- Ariel Christensen + 10 more
Wastewater monitoring is a useful tool to complement case-based surveillance. A hepatitis A virus (HAV) investigation in North Carolina demonstrated that wastewater monitoring detections preceded identification of 2 clinical cases by 12 days. State and local health officials used a preestablished decision tree to respond to wastewater detections of HAV and implement public health actions. The investigation determined that HAV detected in wastewater was likely from 2 people who had not yet developed symptoms or sought testing at the time of detection, providing early information for public health response, including vaccination of family members. Targeted outreach to hospitals as well as medically or socially vulnerable groups at high risk of HAV infection could be recommended in response to consistent HAV detections in wastewater.
- New
- Research Article
- 10.1177/10848223251385553
- Nov 3, 2025
- Home Health Care Management & Practice
- Katsutoshi Ando + 8 more
Background: Old age, comorbidities, and vaccination status are known risk factors for hospitalization and mortality in COVID-19 patients. However, the clinical course and outcomes among those receiving home medical care remain unclear. Methods: We retrospectively reviewed 2598 patients who received doctor-visiting care from 10 clinics in Meguro-city, Tokyo, Japan, between January 2022 and September 2023. Among them, 194 patients diagnosed with COVID-19 after initiating home care were analyzed. Patients were classified into hospitalization versus non-hospitalization groups, and survival versus non-survival groups, for comparison of background characteristics. Results: Among the 194 patients enrolled, COVID-19 severity emerged as a significant risk factor for hospitalization and mortality. Meanwhile, the level of daily life independence for disabled elderly patients was an independent risk factor for hospitalization but not for survival. An ROC analysis revealed that the “optimal” cutoff value of the daily life independence level associated with hospitalization was Rank “B1,” corresponding to a person who requires some assistance indoors and spends most of the day in bed but can sit up. The 3- and 6-month survival rates for all enrolled patients were 89.1% and 83.9%, respectively. The non-survival group had a significantly higher proportion of patients with malignancies and a higher Charlson comorbidity index than the survival group. Conclusions: In home medical care patients, severe COVID-19 infection and lower independence increase hospitalization risk, while malignancies and comorbidities affect mortality, highlighting the importance of assessing functional status and comorbidity profiles to guide clinical decisions for COVID-19 management in home care settings.
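The "optimal" cutoff above comes from an ROC analysis; one widely used criterion for choosing such a cutoff is Youden's J (sensitivity + specificity - 1). The sketch below illustrates that criterion with hypothetical inputs, since the abstract does not state which criterion the authors applied:

```python
# Sketch of choosing an "optimal" ROC cutoff with Youden's J
# (sensitivity + specificity - 1). Inputs are hypothetical:
# y_true = hospitalization (0/1), score = ordinal independence rank.
import numpy as np
from sklearn.metrics import roc_curve

def youden_cutoff(y_true, score):
    fpr, tpr, thresholds = roc_curve(y_true, score)
    j = tpr - fpr                       # Youden's J at each candidate threshold
    best = int(np.argmax(j))
    return thresholds[best], tpr[best], 1 - fpr[best]   # cutoff, sensitivity, specificity
```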
- New
- Research Article
- 10.1371/journal.pntd.0013660
- Nov 3, 2025
- PLoS neglected tropical diseases
- Luciana Lobato Cardim + 12 more
People living in economically disadvantaged circumstances experience higher risks of infections and death from arboviruses. However, more evidence is needed to better understand the socioeconomic factors influencing dengue mortality. We investigated whether people of lower socioeconomic condition in Brazil are more likely to die following dengue infection. Linking nationwide socioeconomic data from the 100 Million Brazilian Cohort with dengue disease and death records registered in Brazil between 1st January 2007 and 31st December 2018, we used multivariable hierarchical analysis to investigate the socioeconomic determinants of dengue-specific and all-cause mortality within 15 days of dengue symptom onset. Among the 3,018,131 individuals from the 100 Million Brazilian Cohort diagnosed with dengue, 1810 died from dengue (Case Fatality Rate (CFR)=0.06%, 95%CI = 0.06-0.06%) and 3076 (CFR = 0.10%, 95%CI = 0.10-0.11%) died from any cause within 15 days of dengue symptom onset. People residing in the Northeast (OR=2.32; 95%CI = 1.74-3.10) and Midwest (OR=1.68; 95%CI = 1.25-2.27) regions, self-identifying as black race/ethnicity (OR=1.58; 95%CI = 1.31-1.90), having a lower level of education (OR=2.35, 95%CI = 1.17-4.73), being retired/receiving a pension (OR=2.24; 95%CI = 1.76-2.86), living in a household with rudimentary sewage (OR=1.19; 95%CI = 1.04-1.37), and having >2 inhabitants per room (OR=1.31; 95%CI = 1.11-1.55) had higher odds of dengue-specific mortality. Similar characteristics were also associated with higher all-cause mortality after dengue infection, with additional associations for residing in the North region (OR=1.60; 95%CI = 1.24-2.06) and rural areas (OR=1.12; 95%CI = 1.01-1.24), self-identifying as Asian (OR=1.65; 95%CI = 1.07-2.54) or mixed race/brown (OR=1.20; 95%CI = 1.10-1.31), and living in households with poorer quality building and sanitary conditions. Our findings provide evidence that individuals in Brazil with lower socioeconomic condition experience increased odds of dengue-specific and all-cause mortality within 15 days of dengue symptom onset. These findings underscore the importance of ensuring equitable access to high-quality treatment for severe dengue and suggest that reducing poverty and social inequality, including through improvement of sanitation and housing, may help mitigate dengue-related mortality.