Abstract

How does the connectivity of a neural network (number of synapses per neuron) relate to the complexity of the problems it can handle? Switching theory would suggest no relation at all, since all Boolean functions can be implemented using a circuit with very low connectivity (e.g., using two-input NAND gates). However, for a network that learns a problem from examples using a local learning rule, we prove that the entropy of the problem becomes a lower bound for the connectivity of the network. The current result generalizes a previous result by removing a restriction on the features that are loaded into the neurons during the learning phase.
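The NAND universality mentioned above is a standard fact from switching theory, not a contribution of the paper. As an illustrative sketch, the following shows how NOT, AND, OR, and XOR can all be built from two-input NAND gates alone, so every gate has fan-in at most 2:

```python
# NAND is functionally complete: any Boolean function can be expressed
# using only two-input NAND gates, keeping per-gate connectivity low.
def nand(a, b):
    return not (a and b)

# Standard constructions from NAND alone.
def not_(a):
    return nand(a, a)

def and_(a, b):
    return nand(nand(a, b), nand(a, b))

def or_(a, b):
    return nand(nand(a, a), nand(b, b))

def xor_(a, b):
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

# Check the constructions against the target truth tables.
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
        assert xor_(a, b) == (a != b)
    assert not_(a) == (not a)
```

This contrasts with the paper's setting: the low-connectivity circuit exists, but a network that must *learn* the function from examples with a local rule is subject to the entropy lower bound on connectivity.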
