Abstract
Purpose: Various global indices are available to summarize results from standard automated perimetry. This study asks which index detects significant deterioration earliest, at a fixed specificity.

Design: Comparison of prognostic indices.

Methods: Two cohorts were tested. A test-retest cohort contained 5 reliable visual fields, obtained within a short interval, from 45 eyes of 23 participants with glaucoma and/or a likelihood of developing glaucoma. A separate longitudinal cohort contained 508 eyes from 330 participants, tested on average 13 times. Three global indices were extracted: mean deviation (MD), pattern standard deviation (PSD), and visual field index (VFI). For each index, a critical P value, CritIndex, was defined such that 5% of test-retest series showed significant deterioration with P < CritIndex when artificial "test dates" were assigned in random order; these criteria therefore have 95% specificity over series of 5 tests. Times to detect significant deterioration in the longitudinal cohort were compared using a survival analysis model.

Results: The median time to detect significant deterioration with MD was 7.3 years (95% confidence interval [CI], 6.8-7.9 years). For VFI, the median was 8.5 years (95% CI, 7.9-9.0 years; P = .088 vs MD). For PSD, the median was 10.5 years (95% CI, 9.3-11.7 years), slower than MD (P < .001). Within the first 5 years of a series, MD detected significant deterioration in 138 eyes, vs 104 for VFI (P = .0013) and 107 for PSD (P = .029).

Conclusions: MD detected significant deterioration sooner than VFI or PSD. In particular, MD detected more eyes within the first 5 years of their follow-up, which were presumably undergoing more rapid progression.
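The specificity calibration described in the Methods can be illustrated with a short sketch. The abstract does not specify how "significant deterioration" is tested, so the sketch below assumes ordinary linear regression of a global index (e.g., MD) against test date, with a one-sided P value for a negative slope; the function and variable names are illustrative only and are not taken from the study.

```python
# Minimal sketch of the CritIndex calibration, under the stated assumptions.
import numpy as np
from scipy.stats import linregress

def one_sided_p(dates, index_values):
    """One-sided P value for a deteriorating (negative) trend of a global index."""
    fit = linregress(dates, index_values)
    # Convert the two-sided P value to a one-sided P value for slope < 0.
    return fit.pvalue / 2 if fit.slope < 0 else 1 - fit.pvalue / 2

def calibrate_crit_index(retest_series, n_perm=200, specificity=0.95, seed=0):
    """Choose CritIndex so that (1 - specificity) of stable test-retest series
    appear to deteriorate when artificial test dates are assigned in random order.

    retest_series: list of length-5 arrays of one index (e.g., MD) per eye,
    collected within a short interval, so any apparent trend is noise.
    """
    rng = np.random.default_rng(seed)
    dates = np.arange(5, dtype=float)  # artificial, evenly spaced "test dates"
    p_values = []
    for series in retest_series:
        series = np.asarray(series, dtype=float)
        for _ in range(n_perm):
            order = rng.permutation(len(series))  # random ordering of the 5 fields
            p_values.append(one_sided_p(dates, series[order]))
    # Critical P value below which 5% of stable series show "significant" deterioration.
    return float(np.quantile(p_values, 1 - specificity))
```

In the longitudinal cohort, an eye would then presumably be flagged at the first visit at which the regression P value over all tests to date falls below CritIndex for that index, and the survival analysis compares the time to that visit across MD, VFI, and PSD.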