Abstract
Neural networks are massively parallel processing systems that require expensive and often unavailable hardware to be realized. Fortunately, the development of effective and accessible software makes their simulation feasible. Various neural network implementation tools therefore exist on the market, but they are tied to the specific learning algorithm used and can simulate only fixed-size networks. In this work, we present object-oriented techniques that have been used to define neuron and network object types which realize, in a localized approach, fast and powerful learning algorithms combining results from optimal filtering and multi-model partitioning theory. With these objects, one can build and implement intelligent learning algorithms that address both the training and the on-line adjustment of the network size. Furthermore, the design methodology models the system as a collection of concurrently executable objects, which makes parallel implementation easy. The overall design results in a general-purpose toolbox characterized by maintainability, reusability, and increased modularity. These features are demonstrated through several practical applications.
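To make the abstract's design idea concrete, the sketch below illustrates, under assumptions of our own, how neuron and network objects with localized update rules could be expressed as concurrently executable objects. It is not the authors' class library; all names (Neuron, Network, localUpdate, trainStep) are hypothetical, and the simple per-neuron error-correction rule merely stands in for the filtering-based estimators described in the paper.

```cpp
// Minimal sketch (hypothetical names, not the paper's toolbox): neuron and
// network objects whose localized updates can run as concurrent tasks.
#include <cmath>
#include <cstdio>
#include <thread>
#include <vector>

// A neuron object encapsulates its own weights and a localized update rule,
// so different learning algorithms can be attached per neuron.
class Neuron {
public:
    explicit Neuron(std::size_t fanIn) : weights(fanIn, 0.1) {}

    double activate(const std::vector<double>& input) const {
        double sum = 0.0;
        for (std::size_t i = 0; i < weights.size(); ++i) sum += weights[i] * input[i];
        return std::tanh(sum);
    }

    // Localized adjustment: uses only this neuron's own error signal,
    // standing in for a per-neuron filtering-based estimator.
    void localUpdate(const std::vector<double>& input, double error, double rate) {
        for (std::size_t i = 0; i < weights.size(); ++i) weights[i] += rate * error * input[i];
    }

private:
    std::vector<double> weights;
};

// A network object is a resizable collection of neuron objects; because each
// update touches only its own neuron's state, updates can execute concurrently.
class Network {
public:
    void addNeuron(std::size_t fanIn) { neurons.emplace_back(fanIn); }  // size adjustable at run time

    void trainStep(const std::vector<double>& input,
                   const std::vector<double>& target, double rate) {
        std::vector<std::thread> workers;
        for (std::size_t n = 0; n < neurons.size(); ++n) {
            workers.emplace_back([&, n] {
                double error = target[n] - neurons[n].activate(input);
                neurons[n].localUpdate(input, error, rate);
            });
        }
        for (auto& w : workers) w.join();
    }

private:
    std::vector<Neuron> neurons;
};

int main() {
    Network net;
    net.addNeuron(2);
    net.addNeuron(2);
    std::vector<double> x{0.5, -0.3}, t{0.2, -0.1};
    for (int epoch = 0; epoch < 100; ++epoch) net.trainStep(x, t, 0.1);
    std::printf("trained a 2-neuron network\n");
    return 0;
}
```

Because each neuron owns its state and update rule, adding or removing neuron objects at run time models the on-line size adjustment mentioned in the abstract, and the per-neuron tasks map naturally onto a parallel implementation.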