Neural development must construct neural circuits that can perform the computations necessary for survival. However, many theoretical models of development do not explicitly address the computational goals of the resulting networks, or how those computations unfold over time. Recurrent neural networks (RNNs) have recently come to prominence both as models of neural circuit computation and as building blocks of powerful artificial intelligence systems. Here, we review progress in using RNNs to understand how developmental processes give rise to effective computations, and how abnormal development disrupts these computations.