This paper discusses the problem of global exponential stability analysis for a class of neural networks (NNs) with probabilistic delays. The delay is assumed to follow a given probability density function, which is discretised into an arbitrary number of intervals. In this way, the NN with random time delays is transformed into one with deterministic delays and random parameters. New conditions for the exponential stability of such NNs are obtained by employing new Lyapunov-Krasovskii functionals and novel techniques for achieving delay dependence. These conditions are shown to reduce conservatism by accounting not only for the range of the time delays but also for the probability distribution of their variation. Numerical examples are provided to demonstrate the advantages of the proposed techniques.
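The discretisation step described above can be sketched numerically: given a delay probability density on a bounded range, the range is partitioned into intervals and each interval is assigned its probability mass. The following is a minimal illustration of that idea, not the paper's actual construction; the function name, the midpoint-rule quadrature, and the uniform-density example are all assumptions made for the sketch.

```python
import numpy as np

def discretise_delay_pdf(pdf, tau_min, tau_max, m, n_sub=400):
    """Partition the delay range [tau_min, tau_max] into m equal intervals
    and approximate the probability mass of each interval by a midpoint
    quadrature rule with n_sub sample points per interval."""
    edges = np.linspace(tau_min, tau_max, m + 1)
    probs = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        width = hi - lo
        # midpoints of n_sub sub-intervals inside [lo, hi]
        mids = np.linspace(lo, hi, n_sub, endpoint=False) + width / (2 * n_sub)
        probs.append(pdf(mids).mean() * width)
    probs = np.asarray(probs)
    return edges, probs / probs.sum()  # renormalise residual quadrature error

# Example: a uniform delay density on [0, 1] split into 4 intervals,
# giving probability 0.25 per interval.
edges, probs = discretise_delay_pdf(lambda t: np.ones_like(t), 0.0, 1.0, 4)
```

Each interval then corresponds to one deterministic delay value occurring with the computed probability, which is how the random-delay model is recast as one with deterministic delays and random (Bernoulli-type) switching parameters.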