Abstract

The physical principles underpinning Taylor dispersion offer a high dynamic range for characterizing the hydrodynamic radius of particles. While Taylor dispersion grants the ability to measure radii spanning nearly five orders of magnitude, the detection of particles is never instantaneous: it requires a finite sample volume, a finite detector area, and a finite detection time for measuring absorbance. First, we show that these practical requirements bias the analysis when the self-diffusion coefficient of the particles is high, which is typically the case for small nanoparticles. Second, we show that the accuracy of the technique can be recovered by treating Taylor dispersion as a linear time-invariant system, which we demonstrate by analyzing the Taylor dispersion spectra of two iron-oxide nanoparticle samples measured under identical experimental conditions. The consequence is that such treatment may be necessary whenever Taylor dispersion analysis is not optimized for a given size but is instead dedicated to characterizing broad groups of particles of varying size and material.
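The linear time-invariant (LTI) picture can be illustrated with a short numerical sketch (not taken from the paper): the measured taylorgram is modeled as the ideal Gaussian elution profile convolved with an instrument impulse response, here a simple rectangular window standing in for the finite injection volume, detector area, and integration time. All numerical values below (capillary radius `Rc`, residence time `tR`, window width `tau`, peak width `sigma_true`) are illustrative assumptions rather than parameters from the study; only the standard Taylor-Aris and Stokes-Einstein relations are used to convert peak variance into a hydrodynamic radius.

```python
import numpy as np
from scipy.signal import fftconvolve
from scipy.optimize import curve_fit

# Physical constants and illustrative (assumed) experimental parameters
kB = 1.380649e-23          # Boltzmann constant, J/K
T = 298.15                 # temperature, K
eta = 0.89e-3              # viscosity of water, Pa.s
Rc = 37.5e-6               # capillary radius, m (assumed)
tR = 300.0                 # mean residence time, s (assumed)

def gaussian_taylorgram(t, A, t_r, sigma):
    """Ideal taylorgram of a monodisperse species: a Gaussian in time."""
    return A * np.exp(-(t - t_r) ** 2 / (2 * sigma ** 2))

def radius_from_sigma(sigma):
    """Hydrodynamic radius via the Taylor-Aris and Stokes-Einstein relations."""
    D = Rc ** 2 * tR / (24 * sigma ** 2)      # diffusion coefficient, m^2/s
    return kB * T / (6 * np.pi * eta * D)     # hydrodynamic radius, m

# --- Simulate a fast-diffusing (small) particle: narrow ideal peak ---
t = np.linspace(0, 600, 6001)                 # time axis, s
dt = t[1] - t[0]                              # sampling interval, s
sigma_true = 5.0                              # s, narrow peak -> high D
ideal = gaussian_taylorgram(t, 1.0, tR, sigma_true)

# Finite detection modeled as an LTI impulse response: a rectangular window
tau = 8.0                                     # s, detection window (assumed)
h = np.ones(int(round(tau / dt)))
h /= h.sum()                                  # normalized boxcar response
measured = fftconvolve(ideal, h, mode="same")

# Naive analysis: fit a plain Gaussian directly to the measured signal
p_naive, _ = curve_fit(gaussian_taylorgram, t, measured, p0=[1.0, tR, 10.0])

# LTI analysis: fit the ideal Gaussian convolved with the known response
def lti_model(t, A, t_r, sigma):
    return fftconvolve(gaussian_taylorgram(t, A, t_r, sigma), h, mode="same")

p_lti, _ = curve_fit(lti_model, t, measured, p0=[1.0, tR, 10.0])

print(f"true Rh      : {radius_from_sigma(sigma_true) * 1e9:.3f} nm")
print(f"naive fit Rh : {radius_from_sigma(abs(p_naive[2])) * 1e9:.3f} nm (biased)")
print(f"LTI fit Rh   : {radius_from_sigma(abs(p_lti[2])) * 1e9:.3f} nm")
```

In this simple model, the convolution broadens the peak, so a direct Gaussian fit overestimates the temporal variance and hence the hydrodynamic radius, and the relative bias is largest when the ideal peak is narrow, i.e. when the diffusion coefficient is high. Fitting the convolution model instead recovers the unbiased peak width, which is the essence of the LTI treatment described in the abstract.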
