Abstract

Précis: We evaluated 16,351 visual field (VF) tests from the Ocular Hypertension Treatment Study (OHTS) database and showed that more frequent testing shortened the time to detect glaucoma progression, with the best trade-off being 6-month intervals for high-risk and 12-month intervals for low-risk patients.

Purpose: To investigate the effect of different testing intervals on the time to detect visual field progression in eyes with ocular hypertension.

Methods: A total of 16,351 reliable 30-2 VF tests from 1575 eyes in the OHTS-1 observation arm, with a mean (95% CI) follow-up of 4.8 (4.7-4.8) years, were analyzed. Computer simulations (n = 10,000 eyes) based on mean deviation (MD) values and the residuals of each risk group (stratified by baseline 5-year risk of developing primary open-angle glaucoma: low, medium, and high) were performed to estimate the time to detect progression with testing intervals of 4, 6, 12, and 24 months using linear regression. The time to detect VF progression (P < 5%) at 80% power was calculated based on an MD slope of -0.42 dB/year. We assessed the time to detect a -3 dB loss as an estimate of clinically meaningful perimetric loss.

Results: At 80% power, and based on a progression rate of -0.42 dB/year, the best trade-off for detecting significant rates of VF change reaching clinically meaningful perimetric loss in high-, medium-, and low-risk patients was 6-, 6-, and 12-month intervals, respectively.

Conclusions: Given the importance of not missing conversion to glaucoma, the testing frequency used in OHTS (6 mo) was optimal for detecting progression in high-risk patients. Low-risk patients could potentially be tested every 12 months to optimize resource utilization.
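The simulation design summarized in the Methods can be sketched in a few lines of Python. The sketch below simulates MD trajectories declining at -0.42 dB/year, fits an ordinary least-squares trend at each visit, and reports the shortest follow-up at which 80% of eyes show a significant (P < 5%) negative slope. The residual SD of 1.5 dB, the one-sided test, and the visit schedule starting at time 0 are illustrative assumptions; the paper derived per-risk-group residuals from the OHTS data, which are not reproduced in the abstract.

```python
# Minimal power simulation under assumed parameters (not the paper's exact setup).
import numpy as np
from scipy import stats

RNG = np.random.default_rng(0)
TRUE_SLOPE = -0.42   # dB/year, progression rate evaluated in the abstract
RESIDUAL_SD = 1.5    # dB, assumed test-retest variability (not from the paper)
N_EYES = 10_000      # number of simulated eyes, as in the abstract
ALPHA = 0.05         # significance criterion (P < 5%)
POWER = 0.80         # target detection power

def fraction_detected(n_tests: int, interval_years: float) -> float:
    """Fraction of simulated eyes with a significant negative OLS slope
    after n_tests visits spaced interval_years apart."""
    t = np.arange(n_tests) * interval_years                 # visit times (years)
    md = TRUE_SLOPE * t + RNG.normal(0.0, RESIDUAL_SD, (N_EYES, n_tests))
    tc = t - t.mean()
    sxx = (tc ** 2).sum()
    slope = md @ tc / sxx                                   # OLS slope per eye
    resid = md - md.mean(axis=1, keepdims=True) - np.outer(slope, tc)
    se = np.sqrt((resid ** 2).sum(axis=1) / (n_tests - 2) / sxx)
    pvals = stats.t.cdf(slope / se, df=n_tests - 2)         # one-sided: slope < 0
    return float((pvals < ALPHA).mean())

def years_to_detect(interval_years: float) -> float:
    """Shortest follow-up at which at least 80% of eyes are flagged."""
    for n in range(3, 80):                                  # >= 3 tests so df > 0
        if fraction_detected(n, interval_years) >= POWER:
            return (n - 1) * interval_years
    return float("nan")

for months in (4, 6, 12, 24):
    print(f"{months:>2}-month interval: ~{years_to_detect(months / 12):.1f} years")
```

Denser schedules shorten detection time but with diminishing returns, which is the trade-off behind the 6- versus 12-month recommendation. The -3 dB benchmark also gives a quick sanity check: at -0.42 dB/year, a clinically meaningful 3 dB loss accrues in roughly 3/0.42 ≈ 7 years, so a detection scheme is only useful if it flags progression well within that horizon.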
