Abstract

Understanding how deep learning architectures work is a central scientific problem. Recently, a correspondence between neural networks (NNs) and Euclidean quantum field theories has been proposed. This work investigates this correspondence in the framework of p-adic statistical field theories (SFTs) and neural networks. In this case, the fields are real-valued functions defined on an infinite regular rooted tree with valence p, a fixed prime number. This infinite tree provides the topology for a continuous deep Boltzmann machine (DBM), which is identified with a statistical field theory on this infinite tree. In the p-adic framework, there is a natural method to discretize SFTs. Each discrete SFT corresponds to a Boltzmann machine with a tree-like topology. This method allows us to recover the standard DBMs and gives new convolutional DBMs. The new networks use O(N) parameters while the classical ones use O(N²) parameters.
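The following is a minimal illustrative sketch, not the authors' construction: it contrasts the coupling-parameter count of a Boltzmann machine whose interactions follow a rooted tree of valence p (one coupling per parent-child edge, hence O(N) parameters) with a fully connected coupling matrix over the same N units (O(N²) parameters). The tree-building routine, node indexing, and energy function below are assumptions made for illustration only.

```python
import numpy as np

def tree_couplings(p, depth):
    """One coupling per parent-child edge of a rooted tree with valence p: O(N) parameters.

    Nodes are indexed level by level, with the root at index 0.
    Returns a dict mapping (parent, child) node indices to a coupling weight.
    """
    couplings = {}
    level_start = [0]
    for l in range(depth):
        level_start.append(level_start[-1] + p ** l)
    for l in range(depth):
        for i in range(p ** l):
            parent = level_start[l] + i
            for c in range(p):
                child = level_start[l + 1] + i * p + c
                couplings[(parent, child)] = np.random.randn()
    return couplings

def tree_energy(spins, couplings, biases):
    """Energy of a tree-structured Boltzmann machine:
    E(s) = -sum over edges of J_e * s_i * s_j  -  sum over nodes of b_i * s_i."""
    pair_term = sum(J * spins[i] * spins[j] for (i, j), J in couplings.items())
    return -pair_term - np.dot(biases, spins)

if __name__ == "__main__":
    p, depth = 2, 4                              # hypothetical example: binary tree, 4 levels below the root
    J_tree = tree_couplings(p, depth)
    n_nodes = sum(p ** l for l in range(depth + 1))
    print(f"tree nodes N = {n_nodes}, tree couplings = {len(J_tree)}")   # N - 1 edges, O(N)
    print(f"dense couplings for the same N = {n_nodes * n_nodes}")       # O(N^2)
    spins = np.random.choice([-1.0, 1.0], size=n_nodes)
    biases = np.zeros(n_nodes)
    print("energy of a random configuration:", tree_energy(spins, J_tree, biases))
```

The point of the sketch is only the scaling comparison printed at the end; the paper's discretization assigns the tree structure from the p-adic framework rather than by the ad hoc indexing used here.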
