This paper analyzes the effect of a processing delay in the adaptive arm of the Least Mean Squares (LMS) algorithm for a system identification problem, so that sensing of the input signal is delayed. The input is assumed to be a zero-mean stationary Gaussian process. The mean and mean-square behavior of the adaptive weight vector is analyzed theoretically; the weight vector is shown to be biased, and this bias significantly affects the mean square deviation (MSD). Monte Carlo simulations are presented in support of the assumptions used to derive the theoretical model as a function of the delay, the input bandwidth, and the LMS step size. The results suggest that similar bias problems arise in more sophisticated adaptive filtering algorithms such as Normalized LMS and Recursive Least Squares.
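The bias described above can be reproduced numerically. The following is a minimal sketch, not the paper's simulation setup: it assumes a short hypothetical FIR plant `h`, an AR(1) (bandlimited, correlated) Gaussian input, and a delay of `D` samples in the regressor sensed by the adaptive filter. Because the input is correlated, the delayed-arm LMS converges to a Wiener solution that differs from `h`, illustrating the biased weight vector.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup for illustration (not from the paper):
h = np.array([1.0, 0.5, -0.3, 0.1])  # hypothetical unknown FIR system
M = len(h)
D = 1          # processing delay (samples) in the adaptive arm
mu = 0.01      # LMS step size
N = 50_000     # number of iterations

# Correlated (bandlimited) zero-mean Gaussian input via a one-pole AR(1) filter.
white = rng.standard_normal(N + M + D)
x = np.empty_like(white)
x[0] = white[0]
for n in range(1, len(white)):
    x[n] = 0.7 * x[n - 1] + white[n]

w = np.zeros(M)
for n in range(M + D, N):
    u_plant = x[n - np.arange(M)]      # regressor seen by the unknown system
    u_adapt = x[n - D - np.arange(M)]  # delayed regressor seen by the filter
    d = h @ u_plant                    # desired (plant output) signal
    e = d - w @ u_adapt                # error computed with delayed sensing
    w += mu * e * u_adapt              # LMS update on the delayed regressor

# With D > 0 and a correlated input, w settles away from h (biased estimate);
# with D = 0 the same loop would converge to h in the mean.
print("bias norm:", np.linalg.norm(w - h))
```

Repeating the run with `D = 0` makes the bias norm shrink toward the steady-state misadjustment level, which is one way to separate the delay-induced bias from ordinary gradient noise.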