Abstract

Time-domain diffuse correlation spectroscopy (TD-DCS) is an emerging noninvasive optical technique with the potential to resolve blood flow (BF) and optical coefficients (reduced scattering and absorption) in depth. Here, we study the effects of finite temporal resolution and gate width in a realistic TD-DCS experiment. We provide a model for retrieving the BF from gated intensity autocorrelations based on the instrument response function, which allows for the use of broad time gates. This, in turn, enables a higher signal-to-noise ratio, which is critical for in vivo applications. In numerical simulations, the use of the proposed model reduces the error in the estimated late-gate BF from 34% to 3%. Simulations are also performed for a wide set of optical properties and source-detector separations. In a homogeneous phantom experiment, the discrepancy between the late-gate BF index and the ungated BF index is reduced from 37% to 2%. This work not only provides a tool for data analysis but also offers physical insights, which can be useful for studying and optimizing the system performance.

Highlights

  • Diffuse correlation spectroscopy (DCS) is an optical technique that measures the motion of scatterers in diffusive media, traditionally by injecting a coherent continuous-wave (CW) laser beam into a turbid sample and characterizing the speckle fluctuations through their intensity autocorrelation function.

  • We show a comparison between the blood flow index (BFI) retrieved using the uncorrected and the instrument response function (IRF)-corrected models, together with error bars indicating their standard deviations across the 20 s blocks, and the average ungated value retrieved by fitting the corresponding autocorrelations with the CW solution of the correlation diffusion equation.[1]

  • We have proposed a model to describe a realistic time-domain diffuse correlation spectroscopy (TD-DCS) experiment, characterized by a finite temporal resolution, based on measurable or known quantities, such as the IRF and the time/path-length gate limits.

Introduction

Diffuse correlation spectroscopy (DCS) is an optical technique that measures the motion of scatterers in diffusive media, traditionally by injecting a coherent continuous-wave (CW) laser beam into a turbid sample and characterizing the speckle fluctuations through their intensity autocorrelation function. A hardware-gating acquisition scheme, which uses fast-gated single-photon detectors to enable measurements at very short source–detector (SD) separations, was recently demonstrated.[9] A theoretical model for TD-DCS in multilayer turbid media has also been proposed recently to improve data analysis in nonhomogeneous biological tissues.[10] Another method to achieve path-length-resolved DCS, initiated by the works of Tualle et al.,[11,12] has recently been shown to be viable in vivo in small animals.[13] While in TD-DCS path lengths are resolved based on their time of flight (ToF), interferometric measurements achieve the same goal by sweeping the source wavelength in time.
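The intensity autocorrelation mentioned above can be illustrated with a minimal sketch: compute the normalized autocorrelation g2(τ) = ⟨I(t)I(t+τ)⟩/⟨I⟩² of a detected intensity trace. The synthetic trace below (a simple AR(1)-filtered exponential-noise signal, with a hypothetical correlation parameter chosen purely for illustration) is a stand-in for a real speckle intensity time series; it is not the paper's data or model.

```python
import numpy as np

def g2(intensity, max_lag):
    """Normalized intensity autocorrelation g2(tau) = <I(t) I(t+tau)> / <I>^2."""
    I = np.asarray(intensity, dtype=float)
    mean_sq = I.mean() ** 2
    return np.array([
        np.mean(I[: len(I) - lag] * I[lag:]) / mean_sq
        for lag in range(1, max_lag + 1)
    ])

# Synthetic intensity trace: exponential noise passed through an AR(1)
# filter, which imposes a finite correlation time (illustrative only).
rng = np.random.default_rng(0)
noise = rng.exponential(scale=1.0, size=50_000)
I = np.empty_like(noise)
I[0] = noise[0]
alpha = 0.9  # hypothetical correlation parameter
for t in range(1, len(noise)):
    I[t] = alpha * I[t - 1] + (1 - alpha) * noise[t]

curve = g2(I, max_lag=50)
# For a correlated signal, g2 starts above 1 and decays toward 1 at long lags;
# in DCS, the rate of this decay encodes the scatterers' motion.
```

In a real DCS measurement the decay rate of this curve, fitted with the appropriate solution of the correlation diffusion equation, yields the blood flow index.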

Theory
Monte Carlo Simulator
Experimental Set-up
Phantom Studies and Data Analysis
Simulations
Simulations for Different Optical Properties and Source-Detector Separations
Experiments
Discussion and Conclusion
Findings
Coherence Length Effects on Amplitude and Decay-Rate of the Autocorrelations