Abstract

The determination of selenium in serum was performed by direct hollow-cathode discharge atomic emission spectrographic analysis, and the optimum conditions for the determination were established. Samples of serum and a series of aqueous standards of SeO2 were dried onto the wall of an aluminium cathode cup at 293 K, a sample-preparation procedure that avoids loss of analyte. Sputtering of selenium with a satisfactory detection limit and acceptable accuracy was achieved at a discharge power of 240 W; water cooling at a flow-rate of 8 l min⁻¹ maintained equal neutral-gas and cathode temperatures of 430 ± 50 K, below the melting-points of Se and SeO2. Using the Se II 444.62 nm and Se II 444.95 nm spectral lines, the detection limit of the proposed method was established as 1 ng ml⁻¹. The selenium content of the samples varied between 80 and 120 ng ml⁻¹, two orders of magnitude higher than the detection limit of the proposed method. The sodium content of the serum (140 mmol l⁻¹) did not interfere in the determination, as it was below the concentration that causes a decrease in spectral-line intensity. The relative error of the concentration values is less than 5%.
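As a quick numerical check of the figures quoted above (this sketch is not part of the paper; the variable names and the nominal 100 ng ml⁻¹ value are illustrative assumptions), the following Python snippet confirms that the 80-120 ng ml⁻¹ serum levels lie roughly two orders of magnitude above the 1 ng ml⁻¹ detection limit and shows the absolute error implied by the 5% relative-error bound.

    # Illustrative arithmetic only; names and the nominal value are assumptions,
    # not part of the original method description.
    import math

    detection_limit = 1.0            # ng/ml, reported detection limit (Se II 444.62/444.95 nm)
    serum_se_levels = (80.0, 120.0)  # ng/ml, reported range of serum selenium content

    # "Two orders of magnitude higher than the detection limit"
    for c in serum_se_levels:
        ratio = c / detection_limit
        print(f"{c:5.0f} ng/ml -> log10(ratio) = {math.log10(ratio):.2f}")

    # Relative error < 5%: absolute error bound at a hypothetical nominal 100 ng/ml
    nominal = 100.0
    print(f"Max absolute error at {nominal:.0f} ng/ml: +/- {0.05 * nominal:.0f} ng/ml")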
