Abstract

This paper discusses the paradigm of fast neural networks (FNN). System invariants of fast transformations are presented. Formal linguistic methods are developed for designing the structure and topology of FNNs and of linear tunable transformations. Methods of tuning FNNs to realize spectral transformations, regular fractals, and optimum filters are considered. The use of FNNs in quantum computing is investigated. A method is offered for estimating the separating capacity of weakly connected feed-forward neural networks, and the dependence of the number of recognizable patterns on the network's degrees of freedom is obtained. Experimental results are presented.
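The fast transformations the abstract refers to factor a dense linear map into O(log n) sparse layers, much like the layered structure of a neural network. As a minimal illustration (this sketch is not from the paper itself), a radix-2 FFT can be written as a cascade of sparse "butterfly" layers, each acting as a fixed-weight linear stage:

```python
import numpy as np

def fft_layers(x):
    """Iterative radix-2 FFT expressed as log2(n) sparse 'butterfly' stages,
    analogous to the layered structure of a fast neural network.
    Assumes len(x) is a power of two."""
    x = np.asarray(x, dtype=complex)
    n = x.size
    levels = n.bit_length() - 1

    # Bit-reversal permutation of the input (the network's input wiring).
    rev = np.zeros(n, dtype=int)
    for i in range(n):
        b, v = 0, i
        for _ in range(levels):
            b = (b << 1) | (v & 1)
            v >>= 1
        rev[i] = b
    a = x[rev]

    # Each stage is a sparse linear layer: n/2 independent 2x2 butterflies.
    size = 2
    while size <= n:
        half = size // 2
        w = np.exp(-2j * np.pi * np.arange(half) / size)  # twiddle factors
        for start in range(0, n, size):
            t = a[start + half:start + size] * w
            a[start + half:start + size] = a[start:start + half] - t
            a[start:start + half] = a[start:start + half] + t
        size *= 2
    return a

x = np.random.default_rng(0).normal(size=8)
print(np.allclose(fft_layers(x), np.fft.fft(x)))  # True
```

Tuning the FNN, in this view, amounts to replacing the fixed twiddle-factor weights of each sparse stage with adjustable parameters while keeping the sparse layered topology.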
