We study a paradigmatic random recurrent neural network introduced by Sompolinsky, Crisanti, and Sommers (SCS). In the infinite-size limit, this system exhibits a direct transition from a homogeneous rest state to chaotic behavior, with the largest Lyapunov exponent increasing gradually from zero. We generalize the SCS model by considering odd saturating nonlinear transfer functions beyond the usual choice ϕ(x) = tanh x. A discontinuous transition to chaos occurs whenever the slope of ϕ at 0 is a local minimum [i.e., for ϕ'''(0) > 0]. Chaos appears out of the blue, via an attractor-repeller fold. Accordingly, the Lyapunov exponent stays bounded away from zero at the birth of chaos.
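For readers who wish to experiment, the sketch below simulates the kind of system the abstract refers to: an SCS-type network dx_i/dt = -x_i + Σ_j J_ij ϕ(x_j) with Gaussian couplings J_ij ~ N(0, g²/N), integrated by forward Euler together with its tangent dynamics to estimate the largest Lyapunov exponent. The parameter values (N, g, dt, T) and the illustrative transfer function ϕ(x) = tanh(x + x³), which is odd, saturating, and has ϕ'''(0) = 4 > 0, are assumptions chosen for demonstration; they are not taken from the paper.

```python
# Minimal numerical sketch (assumed setup, not the paper's code): Euler
# integration of an SCS-type random network together with the linearized
# (tangent) dynamics used to estimate the largest Lyapunov exponent.
import numpy as np

def largest_lyapunov(phi, dphi, N=500, g=1.5, T=200.0, dt=0.05,
                     transient=50.0, seed=0):
    """Estimate the largest Lyapunov exponent of dx/dt = -x + J phi(x),
    with J_ij ~ N(0, g^2/N), via Benettin-style renormalization."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))  # random coupling matrix
    x = rng.normal(size=N)                            # network state
    v = rng.normal(size=N)                            # tangent vector
    v /= np.linalg.norm(v)

    n_transient = int(transient / dt)
    n_steps = int(T / dt)
    log_growth = 0.0
    for step in range(n_transient + n_steps):
        s, d = phi(x), dphi(x)
        x = x + dt * (-x + J @ s)             # network dynamics
        v = v + dt * (-v + J @ (d * v))       # tangent dynamics dv/dt = -v + J diag(phi'(x)) v
        norm = np.linalg.norm(v)
        v /= norm
        if step >= n_transient:               # accumulate only after the transient
            log_growth += np.log(norm)
    return log_growth / (n_steps * dt)

# Usual choice: phi = tanh, with phi'''(0) = -2 < 0 (continuous transition).
lam_tanh = largest_lyapunov(np.tanh, lambda x: 1.0 - np.tanh(x) ** 2)

# Illustrative odd saturating function with phi'''(0) = 4 > 0 (hypothetical
# example, not necessarily the family studied in the paper).
phi_plus = lambda x: np.tanh(x + x ** 3)
dphi_plus = lambda x: (1.0 + 3.0 * x ** 2) / np.cosh(x + x ** 3) ** 2
lam_plus = largest_lyapunov(phi_plus, dphi_plus)

print(f"lambda_max (tanh):        {lam_tanh:.3f}")
print(f"lambda_max (tanh(x+x^3)): {lam_plus:.3f}")
```

With these assumed parameters, g > 1 places the tanh network above its known transition, so a positive exponent is expected there; the second transfer function is included only to show how an alternative odd saturating ϕ would be swapped in.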