Abstract

The polydispersity problem comes about when using laser light scattering to measure the Brownian motion of macromolecules and hence, essentially, to measure their size. The spectrum of intensity fluctuations of light scattered by a suspension of identical particles can be shown to be of Lorentzian shape with half-width proportional to the diffusion constant of the molecules in suspension. A practical technique, now widely used, measures, by special high-speed digital hardware, the autocorrelation function of the digital stream of detected single-photon events (photon correlation spectroscopy, or PCS), which is related by the Wiener-Khintchine theorem to the spectrum and thus gives rise to an exponential function whose time constant is inversely proportional to the diffusion constant. In a typical experiment, 10^7 or more scattered single photons will be analysed using digital circuitry, and a digital read-out of the exponential correlation function will be available with accuracy of the order of 0.1% on each point. These points are chosen to cover a few decay times in either a linear or, preferably, an integrated logarithmic sampling scheme. The data reduction problem in this monodisperse case, namely fitting a single exponential curve, poses no serious problem. When particles of a number of different molecular sizes are present, however — the polydisperse case — each size fraction gives rise to its own exponential curve, and hence the total scattering produces an autocorrelation function which is a sum of exponentials; the data reduction problem then becomes one of Laplace transform inversion.
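The contrast between the two cases can be sketched numerically. In the following illustrative example (all numerical values — decay rates, relative amplitudes, lag-time grid, noise level — are assumed for demonstration only), a single-exponential correlation function is fitted trivially by linear regression on its logarithm, while the same fit applied to a sum of exponentials yields only an apparent average decay rate; recovering the individual rates is the ill-posed Laplace-inversion problem the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)
tau = np.linspace(1e-5, 5e-3, 200)     # lag times covering a few decay times

# --- Monodisperse case: a single exponential is easy to fit ---
gamma_true = 500.0                      # assumed decay rate, s^-1
g_mono = np.exp(-gamma_true * tau)
g_mono *= 1.0 + 1e-3 * rng.standard_normal(tau.size)   # ~0.1% noise per point
gamma_fit = -np.polyfit(tau, np.log(g_mono), 1)[0]     # slope of ln g(tau)

# --- Polydisperse case: a sum of exponentials ---
gammas = np.array([200.0, 1000.0])      # two assumed size fractions
weights = np.array([0.5, 0.5])          # relative scattering amplitudes
g_poly = weights @ np.exp(-np.outer(gammas, tau))
g_poly *= 1.0 + 1e-3 * rng.standard_normal(tau.size)

# The same single-exponential fit now returns only an apparent
# average rate lying between the two true rates; separating the
# components requires inverting a Laplace transform.
gamma_apparent = -np.polyfit(tau, np.log(g_poly), 1)[0]
print(gamma_fit, gamma_apparent)
```

Even with 0.1% noise on every point, the polydisperse fit cannot distinguish the two components; this sensitivity to the distribution of decay rates is what makes the inversion problem hard in practice.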
