Abstract

Automated perimetry is relied on for the functional assessment of patients with glaucoma, but questions remain about its effective dynamic range and its utility for quantifying rates of progression at different stages of the disease. This study aims to identify the bounds within which estimates of rate are most reliable. Pointwise longitudinal signal-to-noise ratios (LSNRs), defined as the rate of change divided by the standard error of the trend line, were calculated for 542 eyes of 273 patients with glaucoma or suspected glaucoma. The relationships between the mean sensitivity within each series and the lower percentiles of the distribution of LSNRs (representing progressing series) were analyzed by quantile regression, with 95% confidence intervals derived by bootstrapping. The 5th and 10th percentiles of LSNRs reached a minimum at sensitivities of 17 to 21 dB. Below this range, estimates of rate became more variable, making the LSNRs of progressing series less negative. A significant step change in these percentiles also occurred at approximately 31 dB, above which the LSNRs of progressing locations became less negative. The lower bound of maximum utility for perimetry was ∼17 to 21 dB, coinciding with previous results suggesting that below this point, retinal ganglion cell responses saturate and noise overwhelms the remaining signal. The upper bound was ∼30 to 31 dB, coinciding with previous results suggesting that above this point, the size III stimulus used is larger than Ricco's area of complete spatial summation. These results quantify the impact of these two factors on the ability to monitor progression and provide quantifiable targets for attempts to improve perimetry.
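The pointwise LSNR as defined above can be sketched in a few lines: fit an ordinary least-squares trend line to each location's sensitivity series and divide the slope by its standard error. The sketch below is illustrative only, assuming an OLS fit per location; the function name and example data are hypothetical, not taken from the study.

```python
import numpy as np

def lsnr(t, y):
    """Longitudinal signal-to-noise ratio of a pointwise sensitivity
    series: the OLS trend-line slope (rate of change, dB/year)
    divided by the standard error of that slope."""
    t, y = np.asarray(t, float), np.asarray(y, float)
    n = len(t)
    tbar = t.mean()
    sxx = np.sum((t - tbar) ** 2)
    # Slope and intercept of the least-squares trend line
    b = np.sum((t - tbar) * (y - y.mean())) / sxx
    a = y.mean() - b * tbar
    resid = y - (a + b * t)
    # Standard error of the slope from the residual variance
    se_b = np.sqrt(np.sum(resid ** 2) / (n - 2)) / np.sqrt(sxx)
    return b / se_b

# Hypothetical progressing series: sensitivity declining ~1.2 dB/year
years = np.arange(0.0, 4.0, 0.5)
noise = np.array([0.3, -0.2, 0.1, -0.4, 0.2, 0.0, -0.1, 0.3])
sens = 30.0 - 1.2 * years + noise
print(lsnr(years, sens))  # strongly negative for a progressing series
```

A stable series (slope near zero relative to its standard error) would give an LSNR near zero, while noisier measurements inflate the standard error and pull the LSNR of a truly progressing location toward zero, which is the effect the study quantifies at low sensitivities.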
