Objective: Kidney stones are a major public health problem worldwide. Identifying markers that can flag at-risk individuals is essential for early detection and timely treatment. This study examines the association between the neutrophil-to-high-density lipoprotein cholesterol ratio (NHR) and the risk of kidney stones in U.S. adults.
Methods: The analysis included 24,532 participants from NHANES 2007–2018 with available NHR and kidney stone data. Multivariable logistic regression models were used to quantify the association between NHR and kidney stone occurrence, and subgroup analyses were conducted to assess whether this association varied across population strata.
Results: A total of 2,351 participants (9.93%) were diagnosed with kidney stones, with a mean age of 47.20 ± 0.26 years. In the fully adjusted multivariable model, higher NHR levels were associated with greater kidney stone risk (OR = 1.05, 95% CI: 1.02–1.08, P = 0.002). Participants in the highest NHR tertile had 34% higher odds of developing kidney stones than those in the lowest tertile. Restricted cubic spline (RCS) regression identified a nonlinear relationship between NHR and kidney stone risk. The association between NHR and kidney stone prevalence did not vary significantly across most subgroups (P for interaction > 0.05).
Conclusion: Elevated NHR is associated with a higher risk of kidney stones, and this association is consistent across diverse populations. NHR may be a useful biomarker of kidney stone risk, with implications for early detection and individualized treatment.
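As a rough illustration of the tertile-based logistic modeling described in the Methods, the sketch below fits an adjusted logistic regression of kidney stone status on NHR tertiles. It uses simulated data and hypothetical variable names (nhr, age, male, stone), not the actual NHANES variables, and omits survey weights and the full covariate set used in the study.

```python
# Minimal sketch of an NHR-tertile logistic regression, assuming simulated data;
# the real analysis would use NHANES variables, full covariates, and survey weights.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000

# Simulated stand-ins for the survey variables (illustrative only).
df = pd.DataFrame({
    "nhr": rng.gamma(shape=3.0, scale=1.5, size=n),      # neutrophils / HDL-C
    "age": rng.normal(47, 15, size=n).clip(20, 80),
    "male": rng.integers(0, 2, size=n),
})
# Binary outcome with a mild positive dependence on NHR (~10% prevalence).
logit_p = -2.6 + 0.05 * df["nhr"] + 0.01 * (df["age"] - 47)
df["stone"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# NHR tertiles, lowest tertile (T1) as the reference group.
df["nhr_t"] = pd.qcut(df["nhr"], 3, labels=["T1", "T2", "T3"]).astype(str)

# Adjusted multivariable logistic regression.
model = smf.logit(
    "stone ~ C(nhr_t, Treatment(reference='T1')) + age + male", data=df
).fit(disp=False)

print(np.exp(model.params))      # odds ratios (T2 and T3 vs. T1, plus covariates)
print(np.exp(model.conf_int()))  # 95% confidence intervals
```

A nonlinear dose-response check, as in the abstract's RCS analysis, could be added by replacing the tertile term with a restricted cubic spline basis of continuous NHR.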