Abstract

The intensity ratio (αI) and area ratio (αA) obtained using Raman spectroscopy are widely used to characterize the physical properties of gases, fluids, and solids, but the underlying parameters that affect their precision have not been elucidated. We use simulations, experiments, and theoretical analyses to investigate the effects of instrumental performance, analytical conditions, and sample size on the precision of αI and αA. We identified the parameters that strongly influence this precision: (1) the intensity of the weaker peak, Iw; (2) the ratio of the bandwidth to the pixel resolution of the weaker peak, Γw/Δxw; (3) the degree of detector saturation; (4) the noise level, determined by readout noise σR, dark noise σD, and shot noise σS; (5) drift; and (6) the sample size. Theoretical and simulation results show that, when Γw/Δxw ≪ (Γs/Δxs)/αI, increasing Iw or Γw/Δxw by a factor of n can improve the precision by a factor of √n. Results showed that the relative contribution of σD and σR to the total noise is a measure of how much the precision achievable in an experiment differs from that under ideal analytical conditions (i.e., the shot-noise limit, σS ≫ σD, σR). When the shot-noise limit does not hold (i.e., when Iw is low), reducing σD or σR by decreasing the readout speed, setting a lower charge-coupled device (CCD) sensitivity, reducing the number of readouts, or narrowing the vertical binning width can improve the precision. We demonstrated that evaluations of the precision of αI or αA based on small sample sizes can suggest physically unattainable precision. When the sample size is n = 5, 10, or 20, the average uncertainty of the estimated precision of αI and αA is approximately 30%, 20%, or 15%, respectively. An accurate and comparable assessment of the precision of αI and αA therefore requires reporting the intensity (in units of electrons), the bandwidth, Δx, and the sample size.
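To illustrate the sample-size effect described above, the following minimal Python sketch (not the authors' simulation code) estimates how uncertain a precision estimate itself is when it is computed from only n replicate ratio measurements. The true ratio, its standard deviation, and the Gaussian noise model are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative Monte Carlo: how reliably does a sample of size n estimate the
# true precision (standard deviation) of a measured ratio such as alpha_I?
# Gaussian-distributed ratios are assumed; values below are hypothetical.
true_alpha = 0.5       # hypothetical true intensity ratio
true_sigma = 0.01      # hypothetical true standard deviation of the ratio
n_trials = 20000       # number of repeated "experiments" per sample size

for n in (5, 10, 20):
    # Each row is one experiment consisting of n replicate ratio measurements.
    samples = rng.normal(true_alpha, true_sigma, size=(n_trials, n))
    s = samples.std(axis=1, ddof=1)        # sample standard deviation per experiment
    rel_uncertainty = s.std() / s.mean()   # spread of the precision estimate itself
    # For Gaussian data the expected relative uncertainty is ~ 1/sqrt(2*(n-1)).
    print(f"n = {n:2d}: relative uncertainty of s ≈ {rel_uncertainty:.0%} "
          f"(theory ≈ {1 / np.sqrt(2 * (n - 1)):.0%})")
```

Under these assumptions the printed values are roughly 35%, 24%, and 16% for n = 5, 10, and 20, consistent in magnitude with the approximately 30%, 20%, and 15% quoted in the abstract.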
