Animal behavior occurs on timescales much longer than the response times of individual neurons. In many cases, it is plausible that these long timescales emerge from the recurrent dynamics of electrical activity in networks of neurons. In linear models, timescales are set by the eigenvalues of a dynamical matrix whose elements measure the strengths of synaptic connections between neurons. It is not clear to what extent these matrix elements need to be tuned to generate long timescales; in some cases, one needs not just a single long timescale but a whole range. Starting from the simplest case of random symmetric connections, we combine maximum entropy and random matrix theory methods to construct ensembles of networks, exploring the constraints required for long timescales to become generic. We argue that a single long timescale can emerge generically from realistic constraints, but a full spectrum of slow modes requires more tuning. Langevin dynamics that generates patterns of synaptic connections drawn from these ensembles involves a combination of Hebbian learning and activity-dependent synaptic scaling.
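To make the linear picture concrete, the following is a minimal numerical sketch (an illustration under stated assumptions, not code from this work). For a linear rate model dx/dt = -x + Jx with symmetric connectivity J, each eigenvalue lam of J contributes a mode that relaxes with timescale tau = 1/(1 - lam), so long timescales require eigenvalues close to 1. The network size N and the spectral radius are illustrative choices; the edge behavior quoted in the comments is the standard square-root vanishing of the semicircle density.

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear rate model dx/dt = -x + J x: a mode of J with eigenvalue lam
# relaxes with timescale tau = 1 / (1 - lam), stable while lam < 1.
N = 2000
radius = 0.98  # illustrative: place the spectral edge of J just below 1

# Random symmetric (GOE-like) connectivity whose semicircle edge ~ radius;
# off-diagonal std radius / (2 sqrt(N)) gives spectral radius ~ radius.
A = rng.normal(0.0, 1.0, size=(N, N))
J = (A + A.T) * radius / np.sqrt(8 * N)

lam = np.linalg.eigvalsh(J)      # ascending order
tau = 1.0 / (1.0 - lam)

print(f"lambda_max = {lam[-1]:.3f} -> slowest timescale tau_max = {tau[-1]:.0f}")
print(f"median timescale = {np.median(tau):.2f} (single-neuron units)")

# Near the semicircle edge the eigenvalue density vanishes as sqrt(1 - lam),
# so slow modes are rare: counts with tau > t fall off roughly as t^(-3/2)
# well below tau_max.
for t in (5, 10, 20):
    print(f"modes with tau > {t:2d}: {np.sum(tau > t)} of {N}")
```

The counts in the last loop illustrate one way to see the asymmetry the abstract describes: pushing the spectral edge toward 1 makes the single slowest mode arbitrarily slow, but the square-root vanishing of the eigenvalue density at the edge keeps slow modes sparse, so a full spectrum of slow modes requires reshaping the spectrum rather than just shifting its edge.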
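The closing sentence invites a dynamical reading, sketched below as a toy (our construction for illustration; the rates eta, beta, sigma and the activity noise level D are hypothetical, and this is not the paper's actual equation). The synaptic matrix itself follows a Langevin equation whose drift combines a Hebbian term, proportional to the equal-time activity covariance C of the linear network (in closed form from the Lyapunov equation for symmetric stable J), and a uniform synaptic-scaling decay; the stationary distribution of such a process is a maximum-entropy-style ensemble constraining the mean-square synaptic weight together with an activity-linked log det(I - J) term. The abstract's full mechanism makes the scaling activity-dependent, which this minimal fixed-rate version does not attempt.

```python
import numpy as np

rng = np.random.default_rng(1)

N, dt, steps = 100, 5e-3, 5000
eta, beta, sigma = 1.0, 1.0, 0.05  # hypothetical learning, scaling, noise rates
D = 0.16  # activity noise level; balance lam*(1-lam) = eta*D/beta = 0.16
          # puts the stable spectral edge near lam = 0.2 in this toy

# weak random symmetric initial connectivity
J = rng.normal(0.0, 0.02, (N, N)) / np.sqrt(N)
J = (J + J.T) / 2.0
I = np.eye(N)

for _ in range(steps):
    # Equal-time activity covariance of dx/dt = (-I + J) x + input noise:
    # the Lyapunov equation for symmetric stable J gives C = D (I - J)^{-1}.
    C = D * np.linalg.inv(I - J)
    # Hebbian term (eta * C): strengthen synapses between correlated neurons.
    # Scaling term (-beta * J): uniformly shrink all synapses.
    xi = rng.normal(0.0, 1.0, (N, N))
    J += dt * (eta * C - beta * J) + np.sqrt(dt) * sigma * (xi + xi.T) / np.sqrt(2 * N)

lam_max = np.linalg.eigvalsh(J).max()
print(f"spectral edge lambda_max = {lam_max:.3f}")
print(f"slowest timescale = {1.0 / (1.0 - lam_max):.2f} single-neuron units")
```

With these rates the Hebbian growth and the scaling decay balance at a spectral edge near lam = 0.2-0.3. In this fixed-rate toy the stable balance point cannot move past lam = 1/2 (the fixed point of eta*D/(1-lam) = beta*lam loses stability there), which is one way to see why plain Hebbian growth plus uniform decay is not enough on its own and why the activity dependence of the scaling matters for pushing the edge close to 1.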