Abstract

The effects of conductivity detection temperature on calibration sensitivity and linearity in suppressed ion chromatography using hydronium or hydroxide eluent were investigated. Theoretical calibration curves for lithium and nitrate ions at 0–35 °C were calculated and compared with experimental data. As the detection temperature was lowered, both the sensitivity and the linearity of calibration at low concentrations improved owing to reduced interference from the water autoionization equilibrium; 4.3- and 1.3-fold increases in linear regression slopes were observed in the 0–1 µmol/L range when the temperature was lowered from 35 to 5 °C for lithium and nitrate, respectively, along with significant increases in the correlation coefficients. Any remaining water autoionization effect was almost completely removed by using eluents contaminated with rubidium or bromide ion at 0.1 µmol/L.
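The temperature effect described above can be illustrated with a rough back-of-the-envelope calculation (not from the paper): the ion product of water, Kw, falls sharply with temperature, so the background hydronium/hydroxide concentration in pure water drops as the detector is cooled. The Kw values below are approximate literature values for pure water; at 35 °C the autoionization background (~0.14 µmol/L) is comparable to the sub-µmol/L analyte levels in the calibration range, while at 5 °C it is roughly a third of that.

```python
import math

# Approximate literature values of the ion product of water, mol^2/L^2
KW = {5: 1.86e-15, 25: 1.01e-14, 35: 2.09e-14}

def hydronium_conc(temp_c):
    """Hydronium (= hydroxide) concentration in pure water, mol/L."""
    return math.sqrt(KW[temp_c])

for t in sorted(KW):
    # Convert to µmol/L for comparison with the 0-1 µmol/L calibration range
    print(f"{t:2d} C: [H3O+] ~ {hydronium_conc(t) * 1e6:.3f} umol/L")
```

This is only a sketch of the equilibrium argument; the paper's theoretical calibration curves additionally account for the temperature dependence of the limiting ionic conductivities.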
