Density evolution is often used to determine the performance of an ensemble of low-density parity-check (LDPC) codes under iterative message-passing algorithms. Conventional density evolution techniques over memoryless channels are based on the assumption that the messages at iteration l are a function only of the messages at iteration l-1 and possibly the channel output. This assumption is valid for many algorithms, such as the standard belief propagation (BP) and min-sum (MS) algorithms. However, there are other important iterative algorithms, such as successive relaxation (SR) versions of BP and MS and the differential decoding with binary message passing (DD-BMP) algorithm of Mobini et al., for which this assumption does not hold. The reason is the memory introduced by these algorithms. In this work, we propose a model for iterative decoding algorithms with memory that covers the SR and DD-BMP algorithms as special cases. Based on this model, we derive a Bayesian network representation for iterative algorithms with memory over memoryless channels and use this representation to analyze the performance of the algorithms using density evolution. The density evolution technique is developed by truncating the memory of the decoding process and approximating it with a finite-order Markov process, and it can be implemented efficiently. As an example, we apply our technique to analyze the performance of DD-BMP on regular LDPC code ensembles and make a number of interesting observations about the performance/complexity tradeoff of DD-BMP in comparison with the BP and MS algorithms. The model presented in this paper is based on certain simplifying assumptions about the memory structure of iterative algorithms, such as the existence of memory only at the outputs of variable nodes in the code's Tanner graph rather than at the outputs of both variable and check nodes. The Bayesian network framework introduced here, however, can still be used to analyze more general scenarios.
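
To make the memoryless assumption concrete, the following is a minimal illustrative sketch (not the memory-aware technique proposed in this work) of conventional density evolution for a (dv, dc)-regular LDPC ensemble over the binary erasure channel, where the message density collapses to a single erasure probability x_l that depends only on x_{l-1} and the channel erasure rate. The function names, parameters, and the bisection-based threshold search are illustrative assumptions, not taken from the paper.

    # Conventional (memoryless) density evolution over the BEC:
    # x_l = eps * (1 - (1 - x_{l-1})^(dc-1))^(dv-1)
    def de_bec(eps, dv=3, dc=6, max_iter=1000, tol=1e-12):
        """Iterate the DE recursion and return the final erasure probability."""
        x = eps
        for _ in range(max_iter):
            x_new = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
            if abs(x_new - x) < tol:
                return x_new
            x = x_new
        return x

    def threshold_bec(dv=3, dc=6, lo=0.0, hi=1.0, iters=50):
        """Bisection for the largest eps at which the recursion converges to 0."""
        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            if de_bec(mid, dv, dc) < 1e-9:
                lo = mid
            else:
                hi = mid
        return lo

    if __name__ == "__main__":
        # For the (3,6)-regular ensemble this prints a threshold near 0.4294.
        print(f"BEC threshold for (3,6)-regular LDPC: {threshold_bec():.4f}")

In algorithms with memory, such as SR and DD-BMP, the update of x_l above would additionally depend on quantities from earlier iterations, which is what motivates the Bayesian network formulation and the finite-order Markov truncation described above.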