Abstract

Standard automated perimetry (SAP) demonstrates high variability. Structural tests such as optical coherence tomography (OCT) may be more repeatable. However, comparisons of their ability to detect glaucomatous change are challenging because the two modalities use different units and dynamic ranges. This study demonstrates a signal-to-noise analysis that places such comparisons within a common framework. Longitudinal data were used from 226 eyes of 130 subjects with non-end-stage glaucoma (mean deviation [MD] from -19.50 to 2.89 dB). Subjects were tested twice a year for a total of at least six visits. For each eye, MD from SAP and average retinal nerve fiber layer thickness (RNFLT) from OCT were regressed linearly against time. "Signal" was defined as the rate of change over time, while "noise" was defined as the SD of residuals from this trend. Individual longitudinal signal-to-noise ratios were calculated. A summary quantification was also calculated, using the 10th percentile of these rates within the cohort as signal and the SD of residuals pooled across all eyes as noise. Individual signal-to-noise ratios were significantly better for OCT RNFLT than for SAP MD (P < 0.0001). The summary quantification of signal-to-noise ratio was better for OCT RNFLT (-1.35 y⁻¹) than for SAP MD (-0.74 y⁻¹). RNFLT measured by OCT had a better longitudinal signal-to-noise ratio than MD from SAP. The longitudinal signal-to-noise ratio provides a means to perform a fair comparison between different techniques that is robust to differences in scale and measurement units. Longitudinal studies in glaucoma should consider reporting signal-to-noise ratios to facilitate interpretation and comparison of results.
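The analysis described above can be sketched in a few lines of code: for each eye, regress the measurement against visit time, take the slope as signal and the SD of the residuals as noise, then form the ratio; the cohort-level summary divides the 10th percentile of the slopes by the pooled residual SD. The helper names below are hypothetical illustrations of this procedure, not the authors' actual code.

```python
import numpy as np

def longitudinal_snr(times, values):
    """Per-eye longitudinal signal-to-noise ratio.

    Signal: slope of an ordinary least-squares fit of the measurement
    (e.g. MD in dB, or RNFLT in microns) against time in years.
    Noise: sample SD of the residuals around that trend line.
    Returns (snr, slope, residuals); SNR is in units of y^-1,
    so it is comparable across modalities with different scales.
    """
    times = np.asarray(times, dtype=float)
    values = np.asarray(values, dtype=float)
    slope, intercept = np.polyfit(times, values, 1)
    residuals = values - (slope * times + intercept)
    noise = residuals.std(ddof=1)  # sample SD of residuals
    return slope / noise, slope, residuals

def summary_snr(per_eye_slopes, per_eye_residuals):
    """Cohort-level summary quantification.

    Signal: the 10th percentile of per-eye rates of change
    (a "rapidly progressing" rate within the cohort).
    Noise: SD of residuals pooled across all eyes.
    """
    signal = np.percentile(per_eye_slopes, 10)
    pooled = np.concatenate(per_eye_residuals)
    return signal / pooled.std(ddof=1)
```

For example, an eye losing roughly 1 dB of MD per year with about 0.1 dB of test-retest scatter around the trend would yield a strongly negative longitudinal SNR, whereas a stable, noisy eye would yield a ratio near zero.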
