Abstract

A stratified model of self-similar modular neural networks is proposed that comprises morphological, structural, topological, and parametric levels. Modular neural network models are simplified, without loss of functionality, by means of translating connections. It is shown that the morphogenesis of the structural model is defined on the set of graded spaces of the network's terminal layers. Structurally regular neural networks are considered, and it is shown that fast transform algorithms (including the FFT) can be described by the topological model of a structurally regular self-similar network. A linguistic model for describing the topologies of regular self-similar networks is presented. An algorithm is proposed for constructing the topological matrices of the matrix-factorized form of fast algorithms. The sufficiency of the topological model for describing the complete set of fast algorithms is shown, and examples are given.
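
The matrix-factorized form of fast algorithms mentioned above can be illustrated with the standard radix-2 Cooley–Tukey factorization of the DFT matrix, in which the transform is written as a product of sparse butterfly factors and a bit-reversal permutation. The sketch below is only an illustrative example of such a factorization, not the paper's topological-matrix construction; all function names are chosen for this example.

```python
import numpy as np

def bit_reversal_permutation(n_bits):
    """Permutation matrix that reorders indices by their reversed binary digits."""
    size = 1 << n_bits
    perm = [int(format(i, f"0{n_bits}b")[::-1], 2) for i in range(size)]
    P = np.zeros((size, size))
    for row, col in enumerate(perm):
        P[row, col] = 1.0
    return P

def butterfly_factor(stage, n_bits):
    """Sparse factor I_{N/L} (x) B_L for stage `stage`, where L = 2**stage."""
    size = 1 << n_bits
    L = 1 << stage
    half = L // 2
    omega = np.exp(-2j * np.pi / L)
    Omega = np.diag(omega ** np.arange(half))   # twiddle factors
    I = np.eye(half)
    B = np.block([[I, Omega], [I, -Omega]])     # radix-2 butterfly block
    return np.kron(np.eye(size // L), B)

def fft_matrix_factorization(n_bits):
    """List of sparse factors whose ordered product equals the DFT matrix."""
    factors = [butterfly_factor(q, n_bits) for q in range(n_bits, 0, -1)]
    factors.append(bit_reversal_permutation(n_bits))
    return factors

if __name__ == "__main__":
    n_bits = 3                                  # N = 8
    N = 1 << n_bits
    product = np.linalg.multi_dot(fft_matrix_factorization(n_bits))
    dft = np.exp(-2j * np.pi * np.outer(np.arange(N), np.arange(N)) / N)
    print(np.allclose(product, dft))            # True: factored form equals the DFT
```

Each factor in the list is sparse (two nonzeros per row), so the factorization realizes the O(N log N) cost of the FFT; in the paper's terms, such factor matrices correspond to the topological matrices of the layers of a structurally regular self-similar network.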
