Abstract

Currently there exists a rather promising new trend in machine learning (ML) based on the relationship between neural networks (NN) and Gaussian processes (GP), including many related subtopics, e.g., signal propagation in NNs, the theoretical derivation of learning curves for NNs, QFT methods in ML, etc. An important feature of convolutional neural networks (CNN) is their equivariance (consistency) with respect to symmetry transformations of the input data. In this work we establish a relationship between the many-channel limit of CNNs with vector-valued neuron activations that are equivariant with respect to the two-dimensional Euclidean group and the corresponding, independently introduced equivariant Gaussian processes.
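
The following is a minimal numerical sketch, not taken from the paper, of the general many-channel NN-to-GP correspondence the abstract refers to: the outputs of a randomly initialised one-hidden-layer convolutional network become jointly Gaussian over weight draws as the number of channels grows. It deliberately ignores the paper's specific setting (equivariance under the two-dimensional Euclidean group, vector-valued activations); the network shape, variances, and the 1D "valid" convolution are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

def sliding_windows(x, k):
    """All length-k patches of a 1D signal x, stacked as rows (a 'valid' conv im2col)."""
    return np.array([x[i:i + k] for i in range(len(x) - k + 1)])

def random_cnn_output(x, n_channels, kernel_size=3, sigma_w=1.0, sigma_v=1.0):
    """Scalar output of a random one-hidden-layer CNN with `n_channels` channels.

    Hidden layer: per-channel 'valid' convolution of x followed by ReLU.
    Readout: Gaussian weights over channels and positions, scaled by
    1/sqrt(n_channels) so the many-channel limit is well defined.
    (Hypothetical toy architecture, not the one studied in the paper.)
    """
    patches = sliding_windows(x, kernel_size)                      # (L_out, k)
    W = rng.normal(0.0, sigma_w / np.sqrt(kernel_size),
                   size=(n_channels, kernel_size))                 # conv filters
    V = rng.normal(0.0, sigma_v, size=(n_channels, len(patches)))  # readout weights
    h = np.maximum(0.0, W @ patches.T)                             # (n_channels, L_out)
    return (V * h).sum() / np.sqrt(n_channels * len(patches))

def excess_kurtosis(s):
    """Fourth standardised moment minus 3; zero for an exactly Gaussian sample."""
    s = s - s.mean()
    return (s ** 4).mean() / (s ** 2).mean() ** 2 - 3.0

# Two fixed inputs. The covariance of the outputs over random weight draws estimates
# the (channel-count-independent) NNGP kernel value, while the excess kurtosis shrinks
# toward zero as the number of channels grows: the outputs become jointly Gaussian,
# i.e. the network converges to a Gaussian process.
x1, x2 = rng.normal(size=16), rng.normal(size=16)
for C in (1, 10, 100, 1000):
    samples = np.array([[random_cnn_output(x, C) for x in (x1, x2)]
                        for _ in range(4000)])
    cov = np.cov(samples.T)
    print(f"channels={C:5d}  cov(f(x1), f(x2)) ~ {cov[0, 1]:+.4f}  "
          f"excess kurtosis of f(x1) ~ {excess_kurtosis(samples[:, 0]):+.3f}")
```

The design choice here is to monitor excess kurtosis rather than the covariance itself, because in this parameterisation the covariance is already the GP kernel in expectation for any channel count; it is the vanishing of higher cumulants that signals the Gaussian-process limit.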

