Psychophysical studies have shown that relative disparity, the difference of two absolute disparities, is important in perceptual judgements such as stereoacuity. A recent study showed that a small but significant population of V2 neurons signals relative disparity, whereas V1 neurons do not (Thomas et al. 1999, SFN Abstr.). A logical hypothesis is that higher visual areas may contain more neurons signaling relative disparity. In this study, we tested whether MT neurons signal relative disparity between their classical receptive field (CRF) and surrounding regions. We recorded 40 neurons from 3 awake fixating rhesus monkeys. A bipartite (center/surround) random-dot stereogram was presented, with the center patch covering the neuron's CRF. Dots in both the center and surround patches moved at the neuron's preferred velocity, and both the center and surround disparities varied from trial to trial. Disparity tuning of responses to the center patch was obtained for each of 3–5 surround disparities, and each tuning curve was fit with a Gabor function. If neurons signal relative disparity, the tuning curves should shift by an amount equal to the surround disparity. For each possible pair of surround disparities, we calculated the shift in the peak or trough of the tuning curves relative to the difference in surround disparity (shift ratio). A shift ratio of 1 indicates that the shift was equal to the difference in surround disparity, consistent with relative disparity encoding. A shift ratio of 0 indicates that there was no shift with surround disparity, consistent with absolute disparity encoding. Although the median shift ratio of 0.041 was significantly different from 0 (sign test, p=0.0005, n=209 shifts), the distribution was tightly clustered around 0, and only 1% (2/209) of shift ratios were larger than 0.5. The results suggest that MT neurons do not signal relative disparity in a center-surround configuration, but rather signal the absolute disparity in their CRF.
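The shift-ratio analysis described above can be sketched as a short computation. This is a minimal illustration, not the authors' analysis code: the function name, the example peak positions, and the disparity values are all hypothetical, standing in for peak locations that would in practice come from Gabor fits to measured tuning curves.

```python
# Hypothetical sketch of the shift-ratio computation described in the
# abstract. In the actual study, each tuning curve was fit with a Gabor
# function (a Gaussian envelope times a cosine) and the peak or trough
# location of the fit was used; the peak values below are made up.

def shift_ratio(peak_a: float, peak_b: float,
                surround_a: float, surround_b: float) -> float:
    """Shift in tuning-curve peak per unit change in surround disparity.

    A ratio near 1 is consistent with relative-disparity encoding
    (the tuning curve tracks the surround); a ratio near 0 is
    consistent with absolute-disparity encoding (no shift).
    """
    return (peak_b - peak_a) / (surround_b - surround_a)

# Hypothetical relative-disparity cell: peak tracks the surround fully.
r_rel = shift_ratio(peak_a=0.1, peak_b=0.5, surround_a=-0.2, surround_b=0.2)

# Hypothetical absolute-disparity cell: peak is unaffected by the surround.
r_abs = shift_ratio(peak_a=0.1, peak_b=0.1, surround_a=-0.2, surround_b=0.2)

print(r_rel, r_abs)
```

With a surround-disparity difference of 0.4 deg, a full 0.4 deg shift of the peak gives a ratio of 1, and no shift gives 0, matching the two reference cases defined in the text.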