Abstract

The degree of self-absorption of spectral lines emitted by a flame is reflected in the amount of absorption that occurs when this light is passed through a secondary absorption flame. Assuming a model for the flame-emission primary source, theoretical working curves were calculated for varying concentrations and for different distributions of ground-state atoms. The sensitivity of absorption (at low optical densities in the absorber) depends markedly on the distribution of ground-state atoms in the emission flame. Using three different types of emission flame, the relative sensitivities were measured experimentally as functions of concentration, and these curves were fitted to the calculated curves of relative sensitivity versus optical density. Two results were thereby obtained: the optical density corresponding to a given solution concentration, and the relative distribution of absorbing atoms in the emission flame. For a spectral line without hyperfine structure (Ca 4227 Å), the optical density found in this way agreed well with the results of conventional atomic absorption measurements, i.e. measurements using the flame under investigation as the absorbing medium and a hollow-cathode lamp as the primary source.
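The abstract does not specify the underlying line-shape or flame model, so the following is a minimal numerical sketch of how such working curves can be computed, assuming a pure Doppler (Gaussian) absorption profile and a simple slab model of the emission flame: a uniformly emitting and absorbing core, optionally surrounded by a non-emitting absorbing mantle as a crude stand-in for a non-uniform distribution of ground-state atoms. The function names, the mantle_frac parameter, and all numerical values are illustrative assumptions, not the authors' actual model.

```python
# Sketch of a working-curve calculation for flame self-absorption.
# Assumptions (not from the paper): pure Doppler (Gaussian) line profile;
# emission flame modelled as a slab of peak optical density tau0, with an
# optional purely absorbing outer mantle. All names are illustrative.
import numpy as np

x = np.linspace(-5.0, 5.0, 2001)   # frequency offset in Doppler half-widths
profile = np.exp(-x**2)            # normalized absorption coefficient k(x)/k0

def emitted_intensity(tau0, mantle_frac=0.0):
    """Spectral profile emerging from the emission flame.

    tau0        : peak optical density of the emitting layer
    mantle_frac : fraction of tau0 sitting in a non-emitting outer layer
                  (hypothetical stand-in for a non-uniform atom distribution)
    """
    tau_core = (1.0 - mantle_frac) * tau0
    core = 1.0 - np.exp(-tau_core * profile)        # self-absorbed emission
    mantle = np.exp(-mantle_frac * tau0 * profile)  # absorption-only mantle
    return core * mantle

def relative_sensitivity(tau0, mantle_frac=0.0):
    """Low-optical-density absorption sensitivity of the emitted line in a
    secondary absorber, normalized to the optically thin (tau0 -> 0) limit."""
    I = emitted_intensity(tau0, mantle_frac)
    sens = np.trapz(I * profile, x) / np.trapz(I, x)
    I0 = profile  # optically thin emission: line shape equals k(x)/k0
    sens0 = np.trapz(I0 * profile, x) / np.trapz(I0, x)
    return sens / sens0

# Working curve: relative sensitivity vs peak optical density of the emitter.
for tau0 in (0.01, 0.1, 0.5, 1.0, 2.0, 5.0):
    print(f"tau0 = {tau0:4.2f}: S_uniform = {relative_sensitivity(tau0):.3f}, "
          f"S_mantle(0.3) = {relative_sensitivity(tau0, 0.3):.3f}")
```

Under these assumptions, fitting measured relative sensitivities (as functions of solution concentration) onto such a family of curves would yield the two quantities the abstract mentions: the optical density corresponding to a given concentration, and a rough measure of how the absorbing atoms are distributed in the emission flame.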
