Abstract

A minimum-description-length (MDL) principled evolutionary mechanism for automatically architecting multilayer feedforward (MLFF) neural networks is proposed. A neural network of this type is considered a generic system implementing a generic model. The final network resulting from the architecting and training is seen as an instance of this generic system, and thus as an implemented instance of the generic model. Disregarding hardware fault tolerance, the description length of the network plus that of its performance deviation from the ideal given training samples must be smaller than the description length of the original samples, and this total description length must be the minimum among all possible states. Constrained by the MDL principle, an MLFF neural network can be automatically architected and trained through an evolutionary mechanism in which the network is allowed and enabled both to expand and to reduce the complexity of its architecture. The resultant network has a partially connected MLFF architecture.
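The MDL criterion described above can be sketched as a two-part code: the bits needed to describe the network itself plus the bits needed to describe its residual errors on the training samples, which together must undercut the cost of transmitting the raw samples. The following is a minimal illustrative sketch, not the paper's actual coding scheme; the fixed-precision weight coding (`bits_per_weight`) and the simple residual code are assumptions introduced here for illustration.

```python
import math

def model_description_length(num_weights, bits_per_weight=16):
    # Bits to encode the network's weights, assuming a fixed-precision code.
    # A real scheme would also charge for the connectivity pattern.
    return num_weights * bits_per_weight

def error_description_length(residuals, precision=1e-3):
    # Bits to encode the deviations from the ideal outputs, assuming each
    # residual is coded to the given precision: roughly log2(|r|/precision + 1)
    # bits per sample, so small errors are cheap and zero error is free.
    return sum(math.log2(abs(r) / precision + 1.0) for r in residuals)

def total_mdl(num_weights, residuals, data_bits):
    # Total two-part description length. The MDL principle requires this
    # total to be smaller than data_bits, the cost of the raw samples;
    # the evolutionary mechanism keeps the candidate minimizing the total.
    total = (model_description_length(num_weights)
             + error_description_length(residuals))
    return total, total < data_bits
```

Under this sketch, growing the network lowers the error term at the cost of the model term, and pruning does the reverse; the evolutionary search would prefer whichever candidate architecture yields the smaller total.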
