Abstract

This paper presents stochastic search algorithms (SSA) suitable for the effective identification, optimization, and training of artificial neural networks (ANN). The modified algorithm of nonlinear stochastic direct search (MN-SDS), introduced by the author, aims to improve the convergence properties of the original nonlinear stochastic direct search (N-SDS) method defined by Professor Rastrigin. Given the vast range of possible algorithms and procedures, the so-called method of stochastic direct search (SDS) is used (referred to in the literature as stochastic local search, SLS). The convergence of MN-SDS is considerably better than that of N-SDS; indeed, it also outperforms a range of gradient-based optimization procedures. SDS, that is, SLS, has not been used widely enough in the identification, optimization, and training of ANN, yet its efficiency in some strongly nonlinear cases makes it well suited to these tasks. The examples presented illustrate, if only partially, the operation and efficiency of SDS, that is, MN-SDS. The backpropagation of error (BPE) method was used for comparison.
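
To make the general idea concrete, the following is a minimal sketch of a generic stochastic direct search (SDS/SLS) training loop: perturb the parameter vector in a random direction and keep the move only if the loss decreases. This is not the author's MN-SDS algorithm; the tiny one-hidden-layer network, the step-size schedule, and all names (sds_train, shrink, etc.) are illustrative assumptions added here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(w, x):
    """Tiny one-hidden-layer network used only to give the search a loss."""
    w1, b1, w2, b2 = w
    h = np.tanh(x @ w1 + b1)
    return h @ w2 + b2

def loss(w, x, y):
    """Mean squared error of the network on the data set."""
    return float(np.mean((forward(w, x) - y) ** 2))

def sds_train(w, x, y, steps=2000, step_size=0.1, shrink=0.999):
    """Generic random-direction descent: accept a trial point only if it improves."""
    best = loss(w, x, y)
    for _ in range(steps):
        # Random trial step applied to every parameter array.
        trial = [p + step_size * rng.standard_normal(p.shape) for p in w]
        trial_loss = loss(trial, x, y)
        if trial_loss < best:          # successful random step: accept it
            w, best = trial, trial_loss
        step_size *= shrink            # slowly reduce the search radius
    return w, best

# Toy regression task: approximate y = sin(x) on [-2, 2].
x = np.linspace(-2, 2, 64).reshape(-1, 1)
y = np.sin(x)
w0 = [rng.standard_normal((1, 8)), np.zeros(8),
      rng.standard_normal((8, 1)), np.zeros(1)]
w_final, final_loss = sds_train(w0, x, y)
print(f"final MSE: {final_loss:.4f}")
```

Unlike backpropagation, such a search needs only loss evaluations (no gradients), which is why direct stochastic methods remain applicable to strongly nonlinear or non-differentiable objectives; the MN-SDS modifications discussed in the paper concern improving the convergence of this basic scheme.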

Highlights

  • The main target of this paper is the presentation of a specific variant of direct SS and its application to the identification and optimization of linear and nonlinear objects or processes

  • Why should one apply stochastic search methods (SSMs) to the problems that arise in the optimization and training of artificial neural networks (ANN)? Our answer is based on the demonstration that SSMs, including stochastic direct search (SDS), have proved very productive in solving problems of complex systems of different nature

  • The central goal of this study is the presentation of the stochastic search approach applied to the identification, optimization, and training of artificial neural networks


Summary

Introduction

The main target of this paper is the presentation of a specific variant of direct SS and its application to the identification and optimization of linear and nonlinear objects or processes. The method of stochastic search was introduced by Ashby [1] in connection with his homeostat. Until the 1960s, Ashby's homeostat was adopted mostly as a philosophical concept in cybernetics, used to explain the stability of rather complex systems subject to influences of a stochastic nature [2]. For quite a long time stochastic direct search (SDS) was not recognized as a competitive option. The research and development work of Professor Rastrigin and his associates promoted SS into a competitive method for solving various problems of identification and optimization of complex systems [3]. It has been shown that SDS algorithms are not only competitive but in some respects superior to well-known methods. For systems with noise, certain numerical options are offered by the method of stochastic approximation (MSA) [5]. In some cases the procedures of SDS are more efficient than MSA [6].

Objectives
Methods
Findings
Discussion
Conclusion
