We study rate control for video transmission over burst-error wireless channels, i.e., channels in which errors tend to occur in clusters during fading periods. In particular, we consider a scenario of packet-based transmission with automatic repeat request (ARQ) error control and a feedback channel. We begin by showing how the delay constraints of real-time video transmission can be translated into rate constraints at the encoder, where the constraints applicable at a given time depend on future channel rates. The acknowledgments received through the feedback channel provide an estimate of the current channel state; combined with an a priori channel model, this information allows us to model the future channel rates statistically, so that the rate constraints at the encoder can be expressed in terms of the expected channel behavior. We can then formalize a rate-distortion optimization problem, namely, assigning a quantizer to each of the video blocks stored in the encoder buffer so that the quality of the received video is maximized. The rate constraints must be included in this optimization, since violating a rate constraint is equivalent to violating a delay constraint and thus results in the loss of a video block. We formalize two approaches: the first minimizes the distortion subject to the expected rate constraints given the channel model and the current observation, while the second allocates bits so as to minimize the expected distortion under the given model. We solve these problems using both dynamic programming and Lagrangian optimization. Our simulation results demonstrate that incorporating the channel information provided by the feedback channel and the a priori model into the rate control algorithm significantly reduces both the video distortion at the decoder and the packet loss rate.
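As an illustration of how a delay constraint becomes a rate constraint (the notation here is ours, chosen for exposition, not taken from the paper): if block $i$ is captured at time $t_i$ and must be decoded within a delay $\Delta T$, then every bit of blocks $1,\dots,i$ must leave the encoder buffer by the deadline, so the allocated rates $r_j$ must satisfy

$$\sum_{j=1}^{i} r_j \;\le\; \sum_{t=1}^{t_i+\Delta T} C(t),$$

where $C(t)$ denotes the number of bits the channel delivers in slot $t$. Since $C(t)$ is unknown for future slots, the right-hand side must be predicted from the channel model and the feedback, which is exactly why the applicable constraints depend on future channel rates.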
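The abstract does not specify the a priori channel model; the sketch below assumes a two-state (Gilbert-Elliott) Markov chain, a common model for burst-error channels, and shows how a belief about the current channel state (inferred from ACKs/NAKs on the feedback channel) yields expected future channel rates. All names and parameter values are illustrative assumptions.

```python
import numpy as np

# Hypothetical two-state (Gilbert-Elliott) burst-error channel:
# state 0 = "good" (packet delivered), state 1 = "bad" (packet lost).
# P[i, j] = probability of moving from state i to state j per slot.
P = np.array([[0.95, 0.05],   # good -> good, good -> bad
              [0.30, 0.70]])  # bad  -> good, bad  -> bad

def expected_channel_rates(belief, horizon, packets_per_slot=1):
    """Expected number of packets delivered in each of the next
    `horizon` slots, given `belief` = distribution over current state."""
    rates = []
    b = np.asarray(belief, dtype=float)
    for _ in range(horizon):
        b = b @ P                              # propagate state distribution
        rates.append(packets_per_slot * b[0])  # deliveries only in "good" state
    return rates

# Example: the most recent ACK suggests the channel is currently good.
print(expected_channel_rates(belief=[1.0, 0.0], horizon=5))
```

Summing these per-slot expectations up to a block's deadline gives the expected rate budget that replaces the unknown future channel rates in the constraint above.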
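A minimal sketch of the Lagrangian bit-allocation step mentioned above, under the assumption that each buffered block has a discrete set of quantizer operating points (rate, distortion) and that the multiplier is found by bisection until the total rate meets the expected budget. The structure is the standard Lagrangian relaxation; the function names and data layout are our assumptions, not the paper's.

```python
def lagrangian_allocation(blocks, lam):
    """Pick, independently for each block, the quantizer minimizing D + lam * R.
    `blocks` is a list of per-block operating points [(rate, distortion), ...]."""
    total_rate, total_dist, choice = 0.0, 0.0, []
    for points in blocks:
        r, d = min(points, key=lambda p: p[1] + lam * p[0])
        choice.append((r, d))
        total_rate += r
        total_dist += d
    return choice, total_rate, total_dist

def allocate(blocks, rate_budget, lo=0.0, hi=1e3, iters=50):
    """Bisect on the Lagrange multiplier until the total allocated rate
    fits within the (expected) channel rate budget."""
    for _ in range(iters):
        lam = 0.5 * (lo + hi)
        _, total_rate, _ = lagrangian_allocation(blocks, lam)
        if total_rate > rate_budget:
            lo = lam   # too many bits: penalize rate more heavily
        else:
            hi = lam   # feasible: try a smaller penalty
    return lagrangian_allocation(blocks, hi)

# Example: four identical blocks, three quantizer choices each.
blocks = [[(100, 9.0), (60, 14.0), (30, 25.0)]] * 4
_, rate, dist = allocate(blocks, rate_budget=250)
print(rate, dist)  # 240.0 56.0
```

The dynamic programming alternative mentioned in the text would instead enforce the per-deadline constraints exactly, at higher computational cost.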