Abstract

We demonstrate that the dynamics of neural networks (NNs) trained with gradient descent and the dynamics of scalar fields in a flat, vacuum-energy-dominated Universe share a deep structural relationship. This duality provides a framework for synergies between the two systems: it offers new ways to understand and explain NN dynamics, as well as new ways to simulate and describe early Universe models. Working in the continuous-time limit of NNs, we analytically match the dynamics of the mean background and of small perturbations around the mean field, highlighting potential differences between the two descriptions in certain limits. We test this analytic description empirically and quantitatively show how the effective field theory parameters depend on the hyperparameters of the NN. As a result of this duality, the cosmological constant is matched inversely to the learning rate in the gradient descent update.

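To make the kind of matching described above concrete, here is a minimal heuristic sketch of the correspondence; the precise coefficients, time rescalings, and normalization conventions are assumptions of this sketch and are fixed properly in the full text. The gradient descent update on parameters $\theta$ with learning rate $\eta$,
\[
\theta_{t+1} = \theta_t - \eta\, \nabla_\theta \mathcal{L}(\theta_t),
\]
becomes, in the continuous-time (gradient-flow) limit,
\[
\frac{d\theta}{dt} = -\eta\, \nabla_\theta \mathcal{L}(\theta).
\]
A homogeneous scalar field $\varphi$ in a flat, vacuum-energy-dominated Universe obeys
\[
\ddot{\varphi} + 3H\dot{\varphi} + \frac{\partial V}{\partial \varphi} = 0,
\qquad H \simeq \sqrt{\frac{\Lambda}{3}} \approx \text{const},
\]
which in the strongly Hubble-damped regime reduces to
\[
\dot{\varphi} \simeq -\frac{1}{3H}\, \frac{\partial V}{\partial \varphi}.
\]
Comparing the two first-order equations suggests identifying the loss with the field potential and the learning rate with an inverse power of the Hubble friction, $\eta \sim 1/(3H) \propto \Lambda^{-1/2}$, so that a larger cosmological constant corresponds to a smaller learning rate, consistent with the inverse matching quoted in the abstract (the exact power and prefactor depend on the conventions adopted in the paper).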