This paper analyzes an adaptive training algorithm for adjusting the tap weights of a tapped-delay-line filter to minimize mean-square intersymbol interference for synchronous data transmission. The significant feature of the adjustment procedure is that convergence is guaranteed for all channel response pulses, even under very severe amplitude and phase distortion. The author examines convergence, the rate of convergence, and the effect of noisy observations of the received pulses, showing that noisy observations produce a random sequence of tap-weight settings whose mean converges to a suboptimal setting. The mean-square deviation of the tap weights from the suboptimal values is asymptotically bounded, and the bound can be made as small as desired by sufficiently reducing the speed of convergence. The suboptimality results from the use of isolated test pulses as the training signal; a training scheme using pseudorandom sequences or the actual data signal does not suffer from this effect. Hence, although possibly useful in other pulse-shaping applications, the technique presented here appears to be primarily of value in providing a conceptual framework for the closely related but more practical techniques to be examined in a forthcoming sequel to this paper.
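
As a rough illustration, not the paper's exact procedure, the following Python sketch simulates a stochastic-gradient adjustment of tapped-delay-line weights from noisy observations of isolated test pulses. The channel pulse h, tap count N, step size mu, and noise level sigma are all illustrative assumptions. Because each noisy observed pulse is used both to form the equalized output and to estimate the gradient, noise-noise cross terms bias the mean tap setting away from the true optimum, mirroring the suboptimality effect described above; shrinking mu tightens the asymptotic spread of the taps at the cost of slower convergence.

```python
import numpy as np

rng = np.random.default_rng(0)

h = np.array([0.1, -0.2, 1.0, 0.3, -0.1])  # assumed channel pulse samples
N = 9                                       # number of equalizer taps
k0 = (len(h) - 1) // 2 + N // 2             # index of the main equalized sample
mu = 0.02                                   # step size: smaller -> slower
                                            # convergence, tighter tap deviation
sigma = 0.05                                # observation-noise std deviation

c = np.zeros(N)
c[N // 2] = 1.0                             # start from a pass-through setting

for _ in range(3000):
    # Each training step observes one isolated test pulse in additive noise.
    r = h + sigma * rng.standard_normal(len(h))
    s = np.convolve(r, c)                   # equalized noisy pulse
    e = s.copy()
    e[k0] -= 1.0                            # deviation from the ideal unit pulse
    # Stochastic gradient (up to a constant factor) of the squared error
    # w.r.t. the taps: grad[j] = sum_k e[k] * r[k - j]. The noise-noise
    # cross terms here are what bias the mean tap setting away from the
    # true optimum, as the abstract describes.
    grad = np.correlate(e, r, mode="valid")
    c -= mu * grad

print("final taps:", np.round(c, 3))
residual = np.delete(np.convolve(h, c), k0)  # ISI samples of the true pulse
print("residual mean-square ISI:", np.sum(residual ** 2))
```

If the gradient were instead correlated against the known, noiseless channel pulse, the bias would vanish and the mean taps would converge to the optimum; it is precisely the reliance on noisy isolated test pulses that produces the suboptimal limit.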