Abstract

We continue the analysis of large deviations for randomly connected neural networks used as models of the brain. The originality of the model lies in the fact that the directed impact of one particle onto another depends on the states of both particles, and that the interactions have random Gaussian amplitudes whose mean and variance scale as the inverse of the network size. Similarly to the spatially extended case (see Cabana and Touboul (2018)), we show that under sufficient regularity assumptions, the empirical measure satisfies a large deviations principle with a good rate function achieving its minimum at a unique probability measure, implying, in particular, its convergence in both the averaged and quenched cases, as well as a propagation-of-chaos property (in the averaged case only). The class of models we consider notably includes a stochastic version of the Kuramoto model with random connections.
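For concreteness, a plausible instance of the dynamics covered is the stochastic Kuramoto model with random connections mentioned above. The display below is a minimal sketch of such a system; the notation (\theta^i_t for the phases, \omega_i for the intrinsic frequencies, J_{ij} for the random synaptic weights, \lambda for the noise intensity, B^i for independent Brownian motions) is assumed for illustration rather than taken from the paper:

\[
d\theta^i_t \;=\; \Bigl(\omega_i \;+\; \sum_{j=1}^{N} J_{ij}\,\sin\bigl(\theta^j_t - \theta^i_t\bigr)\Bigr)\,dt \;+\; \lambda\, dB^i_t,
\qquad
J_{ij} \sim \mathcal{N}\!\Bigl(\tfrac{\bar J}{N},\ \tfrac{\sigma^2}{N}\Bigr)\ \text{i.i.d.}
\]

Here the coupling term \sin(\theta^j_t - \theta^i_t) makes the directed impact of particle j on particle i depend on the states of both particles, and both the mean and the variance of J_{ij} scale as the inverse of the network size N, matching the scaling described in the abstract.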
